<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[TechyGeek]]></title><description><![CDATA[TechyGeek]]></description><link>https://techygeek.xyz</link><image><url>https://cdn.hashnode.com/res/hashnode/image/upload/v1753876318092/b338fac6-16a8-425a-8530-550a834b6afe.jpeg</url><title>TechyGeek</title><link>https://techygeek.xyz</link></image><generator>RSS for Node</generator><lastBuildDate>Mon, 20 Apr 2026 10:01:00 GMT</lastBuildDate><atom:link href="https://techygeek.xyz/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Artificial Neural Network]]></title><description><![CDATA[What is ANN?
As we read in the last article, a Perceptron is an artificial neuron. When two or more neurons are connected together, they form an ANN. An ANN has multiple hidden layers and one output layer.
What are co...]]></description><link>https://techygeek.xyz/artificial-neuron-network</link><guid isPermaLink="true">https://techygeek.xyz/artificial-neuron-network</guid><category><![CDATA[Deep Learning]]></category><category><![CDATA[Machine Learning]]></category><category><![CDATA[ANN]]></category><dc:creator><![CDATA[Kalpesh Patil]]></dc:creator><pubDate>Sat, 02 Aug 2025 11:52:17 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1754119222430/684dcff2-aec6-4a6d-91f4-b69b009b825d.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-what-is-ann">What is ANN?</h2>
<p>As we read in the last article, a Perceptron is an artificial neuron. When two or more neurons are connected together, they form an ANN (Artificial Neural Network). An ANN has multiple hidden layers and one output layer.</p>
<h3 id="heading-what-are-component-of-ann">What are the components of an ANN?</h3>
<p>In an ANN:</p>
<p>The first layer is the Input Layer, where all the inputs are taken in. The layers between the input layer and the output layer are known as Hidden Layers, where all the matrix calculations are done. In forward propagation, the inputs flow through these layers to produce a prediction. Then comes the main algorithm, known as Backpropagation, in which all the weights are updated to make the accuracy of the model better and better.</p>
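<p>To make this flow concrete, here is a minimal NumPy sketch of a forward pass through one hidden layer (the layer sizes and weight values below are made up for illustration, not from any real model):</p>

```python
import numpy as np

def relu(z):
    # ReLU activation: keeps positive values, zeroes out negatives
    return np.maximum(0, z)

# Hypothetical shapes: 3 input features, 4 hidden neurons, 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # hidden layer parameters
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)  # output layer parameters

def forward(x):
    # input layer -> hidden layer: matrix calculation + activation
    h = relu(W1 @ x + b1)
    # hidden layer -> output layer
    return W2 @ h + b2

print(forward(np.array([1.0, 2.0, 3.0])).shape)  # (1,)
```

<p>Backpropagation would then adjust W1, b1, W2, and b2 to reduce the prediction error.</p>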
<p>The different types of ANN are:</p>
<ol>
<li><p><strong>Feedforward Neural Network (FNN)</strong></p>
<p> This is the most basic type of ANN, where information moves in one direction: from the input layer, through the hidden layers, to the output layer. There are no loops or cycles in this type. It is mostly used in regression and classification models.</p>
</li>
<li><p><strong>Convolutional Neural Network (CNN)</strong></p>
<p> CNNs are specially designed for image processing and computer vision tasks. They use convolutional layers, which help in automatically detecting features like edges, shapes, and textures from images. CNNs are the backbone of modern technologies like face recognition and object detection.</p>
</li>
<li><p><strong>Recurrent Neural Network (RNN)</strong></p>
<p> RNNs are used for sequential data like time series, speech, and text. Unlike feedforward networks, RNNs have loops which remember previous outputs. This memory-like structure is helpful in language modelling and prediction tasks.</p>
</li>
</ol>
<h3 id="heading-how-does-ann-learns">How does an ANN learn?</h3>
<p>The main goal of any ANN is to reduce the error between the actual output and the predicted value. This learning process happens through:</p>
<ul>
<li><p><strong>Forward Propagation:</strong></p>
<p>  In this process the input is passed through the layers of the network, and predictions are made based on the current weights and biases.</p>
</li>
<li><p><strong>Loss Function:</strong></p>
<p>  After prediction, the error is calculated using a loss function (like MSE, RMSE, etc.). This tells us how far our predicted values are from the real values.</p>
</li>
<li><p><strong>Backpropagation:</strong></p>
<p>  This is the main learning process of an ANN. The calculated error is propagated back through the network, and all the weights are updated using optimization techniques like gradient descent. The aim is to minimize the error through this process.</p>
</li>
</ul>
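<p>This whole loop can be sketched in a few lines of NumPy. The toy example below (one weight, no bias, data invented for illustration) learns y = 2x by repeating forward propagation, loss calculation, and a gradient descent update:</p>

```python
import numpy as np

# Toy data: the network should learn y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w = 0.0     # single weight, starts at zero
lr = 0.01   # learning rate

for epoch in range(200):
    y_pred = w * x                        # forward propagation
    loss = np.mean((y_pred - y) ** 2)     # loss function (MSE)
    grad = np.mean(2 * (y_pred - y) * x)  # backpropagation: dLoss/dw
    w -= lr * grad                        # gradient descent update

print(round(float(w), 2))  # converges close to 2.0
```

<p>A real ANN does the same thing, just with many weights at once.</p>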
<h3 id="heading-important-terms-to-know">Important Terms to Know</h3>
<ul>
<li><p><strong>Weights</strong> – These are the adjustable parameters in a neural network that influence the strength of connections between neurons.</p>
</li>
<li><p><strong>Bias</strong> – An extra parameter that helps the model shift the activation function.</p>
</li>
<li><p><strong>Activation Function</strong> – Functions like ReLU, Sigmoid, or Tanh which decide whether a neuron should activate or not.</p>
</li>
<li><p><strong>Epoch</strong> – One complete pass of the full training dataset through the network.</p>
</li>
<li><p><strong>Learning Rate</strong> – A small value that controls how much the weights should be updated in each step.</p>
</li>
</ul>
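<p>A quick way to build intuition for the activation functions named above is to evaluate them on a few values (a small NumPy sketch):</p>

```python
import numpy as np

def relu(z):
    # ReLU: max(0, z)
    return np.maximum(0, z)

def sigmoid(z):
    # Sigmoid: squashes any real number into (0, 1)
    return 1 / (1 + np.exp(-z))

z = np.array([-2.0, 0.0, 2.0])
print(relu(z))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
print(np.tanh(0.0))  # 0.0
```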
<p>So now we know what an ANN is and its types. In the next article we will learn how to create a model.</p>
<h3 id="heading-summary">Summary</h3>
<blockquote>
<p>An Artificial Neural Network (ANN) comprises interconnected neurons that form a model with input, hidden, and output layers. Key types include Feedforward Neural Networks (FNN) for regression and classification, Convolutional Neural Networks (CNN) for image processing, and Recurrent Neural Networks (RNN) for sequential data. ANN learns by minimizing error through forward propagation, a loss function, and backpropagation. Important concepts include weights, bias, activation functions, epoch, and learning rate.</p>
</blockquote>
]]></content:encoded></item><item><title><![CDATA[Deep Learning: A Beginner's Guide]]></title><description><![CDATA[History of Deep learning
Before learning about Deep Learning, we should know the history behind it, and how much effort it took for this field to go from nothing to everything.

In 1940s-1950s: The Birth of Neural Network

1943-McCulloch & ...]]></description><link>https://techygeek.xyz/deep-learning-a-beginners-guide</link><guid isPermaLink="true">https://techygeek.xyz/deep-learning-a-beginners-guide</guid><category><![CDATA[Deep Learning]]></category><category><![CDATA[intro to programming]]></category><category><![CDATA[Machine Learning]]></category><category><![CDATA[Perceptron]]></category><category><![CDATA[history]]></category><category><![CDATA[neural networks]]></category><dc:creator><![CDATA[Kalpesh Patil]]></dc:creator><pubDate>Wed, 30 Jul 2025 16:32:24 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1753893248824/fe5bb790-7112-4c3a-8403-a906ce0553f1.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-history-of-deep-learning">History of Deep learning</h2>
<p>Before learning about Deep Learning, we should know the history behind it, and how much effort it took for this field to go from nothing to everything.</p>
<ul>
<li><p><strong>In the 1940s-1950s: The Birth of Neural Networks</strong></p>
</li>
<li><p>1943 - McCulloch &amp; Pitts proposed the first simplified computational model of a neuron</p>
</li>
<li><p>1958 - Frank Rosenblatt introduced the perceptron, the first program of its kind, which could learn binary classification.</p>
</li>
</ul>
<ul>
<li><p><strong>In the 1960s-1980s: The First Winter</strong></p>
</li>
<li><p>1969 - Minsky &amp; Papert showed that a perceptron can't solve XOR (a simple problem)</p>
</li>
<li><p>AI and neural networks fell out of favor because of this problem, and the field stalled for a decade, killing all the hype. This is called the first AI Winter.</p>
</li>
</ul>
<ul>
<li><p><strong>In 1986-1990s: Backpropagation</strong></p>
</li>
<li><p>1986 - Rumelhart, Hinton, &amp; Williams popularized backpropagation, allowing multi-layer perceptrons (MLPs) to be trained.</p>
</li>
<li><p>1989-1998 - Yann LeCun created LeNet, a CNN used for recognizing handwritten characters on bank checks.</p>
</li>
<li><p>All these programs worked well and gave really good output, but there were no good large datasets, and compute was a problem.</p>
</li>
</ul>
<ul>
<li><p><strong>2000s: Data, GPUs, and Patience</strong></p>
</li>
<li><p>More data + better hardware, especially GPUs, which were outstanding at the parallel math neural networks need</p>
</li>
<li><p>Researchers kept pushing deep nets while the mainstream focused on SVMs and decision trees.</p>
</li>
</ul>
<ul>
<li><p><strong>2012: The Breakthrough</strong></p>
</li>
<li><p>AlexNet (Krizhevsky, Sutskever, Hinton) won the ImageNet competition, beating traditional methods by a huge margin.</p>
</li>
<li><p>This was the main breakthrough, as CNNs became recognized by many researchers.</p>
</li>
</ul>
<ul>
<li><p><strong>2013-2018: The Deep Learning Boom</strong></p>
</li>
<li><p>2014 - GANs (Generative Adversarial Networks) were introduced by Ian Goodfellow</p>
</li>
<li><p>2015 - ResNet won ImageNet with 152 layers, enabling much deeper models</p>
</li>
<li><p>2017 - The Transformer architecture was introduced, powering models like GPT and BERT.</p>
</li>
</ul>
<p>I think this much history is enough to understand why and how deep learning came from nothing to everything. I also don't like making articles too long, so I will only cover what is important for this series.</p>
<p>Deep Learning is a really important and fascinating technology, so I hope you liked its history.</p>
<p>Now lets Move ahead:</p>
<h2 id="heading-what-is-neuron">What is a Neuron?</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1753891577059/8c187bc8-7dfd-4392-bea4-dbe2156471de.jpeg" alt class="image--center mx-auto" /></p>
<p>Before knowing this, we should know that the Perceptron (artificial neuron) was developed by researching the human brain and how it functions: there are many electrical impulses continually going on inside our brain. By modelling a neuron as a node, the perceptron was created, with calculations going on inside it.</p>
<p>The different parts of a neuron are:</p>
<ul>
<li><p>Nucleus</p>
</li>
<li><p>Axon</p>
</li>
<li><p>Dendrites</p>
</li>
<li><p>Synapses</p>
</li>
</ul>
<p>Here, synapses are the junctions that pass impulses between different neurons. Researchers figured out that when multiple neurons work together, they create a reaction.</p>
<p>So we know that the neuron is the fundamental building block of the human brain.</p>
<h3 id="heading-what-is-perceptron">What is a Perceptron?</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1753892579120/fe8f479a-51ab-48a2-a353-2d17b9a81aed.webp" alt class="image--center mx-auto" /></p>
<p>As mentioned above, the perceptron is derived from the biological neuron.</p>
<p>A perceptron can be single or part of an MLP (Multi-Layer Perceptron), where multiple neurons are connected with each other. The connections are weights.</p>
<p>It contains weights (w) and one bias value (b). A weight tells us the importance of its input, how much value it carries.</p>
<p>$$Z = w \cdot x + b$$</p><p>Here Z = output</p>
<p>w=weights</p>
<p>x=input feature</p>
<p>b=bias</p>
<p>The result of this equation then goes through an Activation Function. Now you may ask, what is an Activation Function? It is a mathematical function which sets the output on the basis of the input value. There are different types of activation functions, e.g. sigmoid, ReLU, softmax, and many more, which we will learn about in the next articles.</p>
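<p>A tiny worked example of the perceptron equation in NumPy; the weight, bias, and input values below are made up for illustration:</p>

```python
import numpy as np

def sigmoid(z):
    # example activation: squashes z into (0, 1)
    return 1 / (1 + np.exp(-z))

w = np.array([0.5, -0.2, 0.1])  # weights: importance of each input
b = 0.3                         # bias
x = np.array([1.0, 2.0, 3.0])   # input features

z = np.dot(w, x) + b            # Z = w . x + b
output = sigmoid(z)             # pass Z through the activation function
print(round(float(z), 2))       # 0.7
```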
<p>So that's it for today; we will continue in the next article.</p>
<h2 id="heading-couclusion">Conclusion:</h2>
<blockquote>
<p>The article traces the evolution of deep learning from its inception in the 1940s with the creation of the first neural network models, through the challenges and breakthroughs over the decades, to the present day. It highlights key developments such as the introduction of the perceptron, the popularization of backpropagation, and significant advancements like AlexNet and GANs. The text emphasizes the importance of understanding the fundamental concepts of neural networks, including neurons and perceptrons, and their biological inspirations.</p>
</blockquote>
]]></content:encoded></item><item><title><![CDATA[This is my first Blog]]></title><description><![CDATA[Intro
My name is Kalpesh Patil, currently doing my B.E. in Information Technology. Currently I am learning about Machine Learning, Deep Learning and Gen AI, and I love learning about these things. I created this blog for spreading the knowledge the one peopl...
<p>My name is Kalpesh Patil, and I am currently doing my B.E. in Information Technology. I am learning about Machine Learning, Deep Learning, and Gen AI, and I love learning about these things. I created this blog to spread the knowledge that people really need.</p>
<h2 id="heading-what-will-i-publish">What will I Publish</h2>
<p>As I said in my intro, I am enthusiastic about AI, so I'll simply upload posts explaining topics and new technologies which I have learned about:</p>
<ul>
<li><p>Machine learning</p>
</li>
<li><p>Deep Learning</p>
</li>
<li><p>Gen AI</p>
</li>
<li><p>Cloud</p>
</li>
<li><p>MLops</p>
</li>
</ul>
<h2 id="heading-why-i-am-explaining-if-everything">Why am I explaining everything</h2>
<p>Writing about these technologies will enhance my knowledge of them.</p>
<p>As a wise man said:</p>
<blockquote>
<p>‘Learning Stuff &lt;&lt;&lt; Teaching Stuff’</p>
</blockquote>
<p>So that's it. As this is a blog, I'll soon update it according to my experience.</p>
<h2 id="heading-summary-by-ai">Summary by AI</h2>
<blockquote>
<p>Kalpesh Patil, a B.E. student in Information Technology, is passionate about AI, including Machine Learning, Deep Learning, and General AI. This blog is a platform to share and expand his knowledge on these topics and related technologies such as Cloud and MLOps. Through writing, he aims to enhance his understanding while helping others learn.</p>
</blockquote>
]]></content:encoded></item></channel></rss>