Pinned · Marko Vidrih in GoPenAI
Mixture of Experts (MoE) in AI Models Explained
The Mixture of Experts (MoE) offers a unique approach to efficiently scaling models while maintaining, or even improving, their…
5 min read · Dec 12, 2023

Pinned · Marko Vidrih
Do You Need a Zero Knowledge Proof?
Where exactly do zero-knowledge proofs fit in?
6 min read · Feb 28, 2024

Pinned · Marko Vidrih
Crypto Trends Report 2024
A full report of crypto trends in 2024 by Marko Vidrih
2 min read · Mar 14, 2024

Pinned · Marko Vidrih in GoPenAI
Multi-Agent LLM: Harnessing Large Language Models for the Generation of Artificial Experts
What Happens When Multiple AIs Talk to Each Other?
3 min read · Oct 5, 2023

Marko Vidrih
Mistral Fine-Tuning API: Here’s What You Need To Know
Mistral API for fine-tuning Mistral 7B and Mistral Small models
2 min read · 1 day ago

Marko Vidrih
Mamba-2 is Out: Can it Replace Transformers?
Mamba-2: A new state space model architecture that outperforms Mamba and Transformer++
3 min read · 1 day ago

Marko Vidrih
The Easiest Way to Run Llama 3 Locally
Download, install, and type one command in the terminal to start using Llama 3 on your laptop.
3 min read · May 17, 2024

Marko Vidrih
New Dawn of Web3 Security: Veritas Protocol Review
The vast expanse of Web3, where innovation thrives alongside vulnerability, demands a stalwart protector. Veritas Protocol, not just…
6 min read · Apr 25, 2024

Marko Vidrih
Crypto Trends 2024 Report: AI and Crypto Synergy
You can get the full report here.
4 min read · Apr 8, 2024

Marko Vidrih
Crypto Trends 2024 Report: NFT (R)evolution
You can get the full report here.
3 min read · Apr 8, 2024