ashutosh nayak, "Tricks of Getting Desired Output from LLMs in RAGs: Purely Anecdotal" (Mar 17). There are multiple sources on learning "Prompt Engineering". This blog is not on "how to prompt", and I assume you already know the basics…

ashutosh nayak, "Brief Introduction to Different Types of Prompting for LLMs" (Mar 17). Prompt engineering aims at designing prompts to get the desired output from LLMs. While prompt engineering is vast, this blog briefly…

ashutosh nayak, "Decision Tree: What is Information Gain Criteria to Split Nodes" (Aug 19, 2022). A short blog on the idea behind splitting a tree at a node (into two sub-branches). One of the criteria is…

ashutosh nayak in Towards Data Science, "p-Value and Power of a Test" (Sep 1, 2020). The idea of the p-value.

ashutosh nayak in Towards Data Science, "Akaike Information Criteria" (Mar 13, 2020). The idea behind AIC.

ashutosh nayak in Towards Data Science, "Idea Behind LIME and SHAP" (Dec 22, 2019). Intuition behind ML interpretation models.

ashutosh nayak in Towards Data Science, "XGBoost: An Intuitive Explanation" (Dec 17, 2019). How XGBoost trees are constructed.

ashutosh nayak in Towards Data Science, "Cross-entropy: From an Information theory point of view" (Jun 22, 2019). Connecting information theory with the cross-entropy loss function.

ashutosh nayak in Towards Data Science, "Dealing with Type II Endogeneity" (Jun 2, 2019). Examples from the literature dealing with Type II endogeneity.

ashutosh nayak in Towards Data Science, "Dealing with Type I endogeneity" (Jun 2, 2019). Using an ice-cream vendor example to explain Type I endogeneity.