<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Product Decisions: Bytes]]></title><description><![CDATA[Concepts explained simply]]></description><link>https://www.nitinmadeshia.com/s/bytes</link><image><url>https://substackcdn.com/image/fetch/$s_!gD1G!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F025a3b46-c012-4c5a-927c-ba926a2ac7b4_704x704.png</url><title>Product Decisions: Bytes</title><link>https://www.nitinmadeshia.com/s/bytes</link></image><generator>Substack</generator><lastBuildDate>Sat, 09 May 2026 02:23:54 GMT</lastBuildDate><atom:link href="https://www.nitinmadeshia.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Nitin Madeshia]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[nitinmadeshia@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[nitinmadeshia@substack.com]]></itunes:email><itunes:name><![CDATA[Nitin Madeshia]]></itunes:name></itunes:owner><itunes:author><![CDATA[Nitin Madeshia]]></itunes:author><googleplay:owner><![CDATA[nitinmadeshia@substack.com]]></googleplay:owner><googleplay:email><![CDATA[nitinmadeshia@substack.com]]></googleplay:email><googleplay:author><![CDATA[Nitin Madeshia]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[What Does "7 Billion Parameters" Actually Mean in AI? 
]]></title><description><![CDATA[You&#8217;ve probably seen these phrases thrown around in AI conversations:]]></description><link>https://www.nitinmadeshia.com/p/what-does-7-billion-parameters-actually</link><guid isPermaLink="false">https://www.nitinmadeshia.com/p/what-does-7-billion-parameters-actually</guid><dc:creator><![CDATA[Nitin Madeshia]]></dc:creator><pubDate>Mon, 07 Jul 2025 17:54:46 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!umys!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!umys!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!umys!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!umys!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!umys!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 1272w, 
https://substackcdn.com/image/fetch/$s_!umys!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!umys!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2443832,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.nitinmadeshia.com/i/167744302?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!umys!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!umys!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!umys!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!umys!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1e2e98b-ea36-4557-a707-a58cf53a89cf_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>You&#8217;ve probably seen these phrases thrown around in AI conversations:<br><br>&#8220;7 billion 
parameters.&#8221;<br>&#8220;GPT-3 has 175 billion parameters.&#8221;</p><p>But what do these numbers actually mean?</p><p>Imagine you&#8217;re teaching a child to recognize animals. In the beginning, they might think every four-legged creature is a dog. But as they see more examples and get corrected, they start noticing subtle differences: maybe it&#8217;s the ears, the tail, the way it walks. Their brain adjusts. That&#8217;s exactly what parameters do for AI models.</p><p>Think of parameters as tiny knobs inside an AI model. Each knob controls how much weight the model gives to a certain pattern it has learned. During training, the AI tweaks these knobs, billions of them, to become better at whatever task it&#8217;s learning: predicting the next word, recognizing an image, or recommending a product.</p><p>More parameters mean the model has more &#8220;mental knobs&#8221; to fine-tune its understanding. That&#8217;s why bigger models can handle more complex tasks; they simply have more capacity to learn. But size comes with trade-offs: bigger models are harder to train, need more GPUs, and are more expensive to run. So bigger isn&#8217;t always better; it depends on the problem you&#8217;re solving.</p><p>Parameters are where the learning lives in an AI model. More knobs, more learning power.</p>]]></content:encoded></item></channel></rss>