Deep Learning 101: Beginner's Guide to Neural Network
HS13 — March 1, 2021

Introduction

If there is one area in data science that has led to the growth of Machine Learning and Artificial Intelligence in the last few years, it is Deep Learning. From research labs in universities with little adoption in industry to powering every smart device on the planet, Deep Learning and Neural Networks have started a revolution.

Note: If you are more interested in learning the concepts in an audio-visual format, we have this entire article explained in the video below. If not, you may continue reading.

In this article, we will introduce you to the components of neural networks.

Building Blocks of a Neural Network: Layers and Neurons

There are two building blocks of a Neural Network. Let's look at each of them in detail.

1. What are Layers in a Neural Network?

A neural network is made up of vertically stacked components called layers. Each dotted line in the image represents a layer. There are three types of layers in a neural network:

Input Layer – First is the input layer. This layer accepts the data and passes it to the rest of the network.

Hidden Layer – The second type of layer is called the hidden layer. A neural network has one or more hidden layers; in the case above, the number is 1. Hidden layers are the ones actually responsible for the excellent performance and complexity of neural networks. They perform multiple functions at the same time, such as data transformation and automatic feature creation.

Output Layer – The last type of layer is the output layer, which holds the result, or output, of the problem. Raw images get passed to the input layer, and we receive the output in the output layer. For example, in this case we provide an image of a vehicle, and the output layer tells us whether it is an emergency or a non-emergency vehicle, after the image has passed through the input and hidden layers, of course.

Now that we know about layers and their function, let's talk in detail about what each of these layers is made up of.

2. What are Neurons in a Neural Network?

A layer consists of small individual units called neurons.

A neuron in a neural network can be better understood with the help of biological neurons. An artificial neuron is similar to a biological neuron: it receives input from other neurons, performs some processing, and produces an output.

Now let's look at an artificial neuron. Here, X1 and X2 are inputs to the artificial neuron, f(X) represents the processing done on the inputs, and y represents the output of the neuron.

What is the Firing of a Neuron?

In real life, we have all heard the phrase "fire up those neurons" in one form or another. The same applies to artificial neurons as well: every neuron has a tendency to fire, but only under certain conditions. For example, if we represent f(X) by addition, then one neuron may fire when the sum of its inputs is greater than, say, 100, while another neuron may fire when the sum is greater than 10.

These conditions, which differ from neuron to neuron, are called thresholds. For example, if the input X1 into the first neuron is 30 and X2 is 0, this neuron will not fire, since the sum 30 + 0 = 30 is not greater than its threshold of 100. If the same inputs had gone into the other neuron, it would have fired, since the sum of 30 is greater than its threshold of 10.
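To make the firing condition concrete, here is a minimal Python sketch (this code is not part of the original article; the function name and example values simply mirror the numbers above). A neuron fires, outputting 1, when the sum of its inputs exceeds its threshold, and outputs 0 otherwise:

```python
def neuron_fires(inputs, threshold):
    """Return 1 if the sum of the inputs exceeds the threshold, else 0."""
    total = sum(inputs)
    return 1 if total > threshold else 0

# The two neurons from the example: same inputs, different thresholds.
print(neuron_fires([30, 0], threshold=100))  # 0 -> 30 is not greater than 100
print(neuron_fires([30, 0], threshold=10))   # 1 -> 30 is greater than 10, so it fires
```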
Now, the negative of the threshold is called the Bias of a neuron. Let us represent this a bit mathematically. We can represent the firing and non-firing conditions of a neuron with the following pair of statements: if X1 + X2 + ... + Xn > threshold, the neuron fires; otherwise, it does not.

Let's simplify this a bit and bring the threshold to the left-hand side of the inequality: X1 + X2 + ... + Xn + (-threshold) > 0. This negative threshold is called the Bias, so with bias b = -threshold the neuron fires when X1 + X2 + ... + Xn + b > 0.

One thing to note is that in an artificial neural network, all the neurons in a layer have the same bias. Now that we have a good understanding of bias and how it represents the condition for a neuron to fire, let's move to another aspect of an artificial neuron called Weights.

So far, even in our calculations, we have assigned equal importance to all the inputs. For example, here X1 has a weight of 1, X2 has a weight of 1, and the bias has a weight of 1. But what if we want different weights attached to different inputs?

Let's look at an example to understand this better. Suppose there is a college party today and you have to decide whether you should go, based on some input conditions: Is the weather good? Is the venue near? Is your crush coming?

If the weather is good, it will be represented by a value of 1, otherwise 0. Similarly, if the venue is near, it will be represented by 1, otherwise 0, and likewise for whether your crush is coming to the party or not.

Now suppose, being a college teenager, you absolutely adore your crush and will go to any lengths to see him or her. You will definitely go to the party no matter how the weather is or how far the venue is. In that case, you will want to assign more weight to X3, which represents the crush, than to the other two inputs.

Such a situation can be represented by assigning weights to the inputs like this: a weight of 3 to the weather, a weight of 2 to the venue, and a weight of 6 to the crush. Now, if the weighted sum of these three factors (weather, venue, and crush) is greater than a threshold of 5, you decide to go to the party; otherwise you do not.

Note: X0 is the bias value.

So, for example, we initially took the condition where the crush is more important than the weather or the venue. Let's say the weather (X1) is bad, represented by 0, and the venue (X2) is far off, represented by 0, but your crush (X3) is coming to the party, represented by 1. When you calculate the sum after multiplying the values of the Xs by their respective weights, you get 0 for the weather (X1), 0 for the venue (X2), and 6 for the crush (X3). Since 6 is greater than the threshold of 5, you decide to go to the party, and hence the output (y) is 1.

Let's imagine a different scenario now. Imagine you're sick today and, no matter what, you will not attend the party. This situation can be represented by assigning an equal weight of 1 to the weather, the venue, and the crush, with a threshold of 4. In this case, even if the weather is good, the venue is near, and your crush is coming, you won't be going to the party, since the sum 1 + 1 + 1 = 3 is less than the threshold value of 4.

These w0, w1, w2, and w3 are called the weights of the neuron and are different for different neurons. These weights are the ones that a neural network has to learn in order to make good decisions.
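The party decision above is easy to express in code. Here is a minimal Python sketch (again an illustrative addition, not from the original article): the neuron multiplies each input by its weight, adds the bias (the negative of the threshold), and fires if the result is greater than zero.

```python
def weighted_neuron(inputs, weights, bias):
    """Fire (return 1) if the weighted sum of the inputs plus the bias exceeds 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum + bias > 0 else 0

# Scenario 1: weather bad (0), venue far (0), crush coming (1);
# weights 3, 2, 6 and threshold 5, i.e. bias = -5 -> 6 - 5 > 0, so you go.
print(weighted_neuron([0, 0, 1], [3, 2, 6], bias=-5))  # 1

# Scenario 2: everything favourable but you are sick; equal weights of 1
# and threshold 4, i.e. bias = -4 -> 3 - 4 < 0, so you stay home.
print(weighted_neuron([1, 1, 1], [1, 1, 1], bias=-4))  # 0
```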
Activation Functions in a Neural Network

Now that we know how a neural network combines different inputs using weights, let's move to the last aspect of a neuron, the Activation function. So far, what we have been doing is simply adding some weighted inputs and calculating an output, and this output can range from minus infinity to infinity.

But this can be a problem in many circumstances. Assume we first want to estimate the age of a person from his height, weight, and cholesterol level, and then classify the person as old or not, based on whether the age is greater than 60.

Now, if we use the neuron as it stands, even an age of -20 is possible: according to the current structure of this neuron, the predicted age ranges from -∞ to ∞. Even with this absurd range, we can still apply our condition to decide whether a person is old or not. For example, if the criterion is that a person is old only if the age is greater than 60, then even if the age comes out to be -20 we can use this criterion to classify the person as not old.

But it would have been much better had the age made more sense, for example if the output of this neuron, which represents the age, had been in the range of, say, 0 to 120. So, how can we solve this problem when the output of a neuron is not in a particular range?

One method to clip the age on the negative side is to use a function such as max(0, X). First, note the original behaviour, before applying any function: for positive X we had a positive Y, and for negative X we had a negative Y (here the x-axis represents the actual values and the y-axis the transformed values). Now, if you want to get rid of the negative values, you can use a function like max(0, X): anything on the negative side of the x-axis gets clipped to 0.

This type of function is called a ReLU function, and these classes of functions, which transform the combined input, are called Activation functions. So, ReLU is an activation function.

Depending on the type of transformation needed, there can be different kinds of activation functions. Let's have a look at some of the popular ones; a small code sketch of these functions follows after the list.

Sigmoid activation function – This function transforms the combined input to a range between 0 and 1. For example, if the input can run from minus infinity to infinity along the x-axis, the sigmoid function restricts this infinite range to a value between 0 and 1.

Tanh activation function – This function transforms the combined input to a range between -1 and 1. Tanh looks very similar in shape to the sigmoid, but it restricts the range to between -1 and 1.

Different activation functions perform differently on different data distributions, so sometimes you have to try several activation functions and find out which one works better for a particular problem.
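Here is a small Python sketch of the three activation functions discussed above, using NumPy (the code and example values are illustrative additions, not part of the original article):

```python
import numpy as np

def relu(x):
    """Clip negative values to 0: max(0, x)."""
    return np.maximum(0, x)

def sigmoid(x):
    """Squash any real value into the range (0, 1)."""
    return 1 / (1 + np.exp(-x))

def tanh(x):
    """Squash any real value into the range (-1, 1)."""
    return np.tanh(x)

x = np.array([-20.0, -1.0, 0.0, 1.0, 20.0])
print(relu(x))     # [ 0.  0.  0.  1. 20.] -> negatives are clipped to 0
print(sigmoid(x))  # values squeezed between 0 and 1
print(tanh(x))     # values squeezed between -1 and 1
```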
End Notes

So far, we have discussed that a neural network is composed of different types of layers stacked together, and that each of these layers is composed of individual units called neurons. Every neuron has three properties: the first is the bias, the second is the weights, and the third is the activation function.

The bias is the negative of the threshold beyond which you want the neuron to fire, the weights define how important each input is relative to the others, and the activation function helps transform the combined weighted input into the range required by the task at hand.

I highly recommend you check out our Certified AI & ML BlackBelt Plus Program to begin your journey into the fascinating world of data science and learn these and many more topics.

I hope this article works as a starting point for your learning about neural networks and deep learning. Reach out to us in the comments below in case you have any doubts.