Layers in a Neural Network explained - deeplizard


Layers of a neural network

In this post, we'll be working to better understand the layers within an artificial neural network. We'll also see how to add layers to a sequential model in Keras.

In the last post, we saw how the neurons in an ANN are organized into layers. The examples we looked at showed the use of dense layers, which are also known as fully connected layers. There are, however, different types of layers. Some examples include:

Dense (or fully connected) layers
Convolutional layers
Pooling layers
Recurrent layers
Normalization layers

Why have different types of layers?

Different layers perform different transformations on their inputs, and some layers are better suited for certain tasks than others.

For example, a convolutional layer is usually used in models that are doing work with image data. Recurrent layers are used in models that are doing work with time series data, and a fully connected layer, as the name suggests, fully connects each input to each output within its layer.

For now, we will keep our focus on layers in general, and we'll learn more in depth about specific layer types as we descend deeper into deep learning.

Example artificial neural network

Let's consider the following example ANN:

We can see that the first layer, the input layer, consists of eight nodes. Each of the eight nodes in this layer represents an individual feature from a given sample in our dataset.

This tells us that a single sample from our dataset consists of eight dimensions. When we choose a sample from our dataset and pass it to the model, each of the eight values contained in the sample will be provided to a corresponding node in the input layer.

We can see that each of the eight input nodes is connected to every node in the next layer.
Each connection between the first and second layers transfers the output from the previous node to the input of the receiving node (left to right). The two layers in the middle that have six nodes each are hidden layers, simply because they are positioned between the input and output layers.

Layer weights

Each connection between two nodes has an associated weight, which is just a number.

Each weight represents the strength of the connection between the two nodes. When the network receives an input at a given node in the input layer, this input is passed to the next node via a connection, and the input will be multiplied by the weight assigned to that connection.

For each node in the second layer, a weighted sum is then computed with each of the incoming connections. This sum is then passed to an activation function, which performs some type of transformation on the given sum. For example, an activation function may transform the sum into a number between zero and one. The actual transformation will vary depending on which activation function is used. More on activation functions later.

node output = activation(weighted sum of inputs)

Forward pass through a neural network

Once we obtain the output for a given node, the obtained output is the value that is passed as input to the nodes in the next layer.

This process continues until the output layer is reached. The number of nodes in the output layer depends on the number of possible output or prediction classes we have. In our example, we have four possible prediction classes.

Suppose our model was tasked with classifying four types of animals. Each node in the output layer would represent one of four possibilities. For example, we could have cat, dog, llama, or lizard. The categories or classes depend on how many classes are in our dataset.

For a given sample from the dataset, the entire process from the input layer to the output layer is called a forward pass through the network.

Finding the optimal weights

As the model learns, the weights at all connections are updated and optimized so that the input data points map to the correct output prediction classes. More on this optimization process as we go deeper into deep learning.

This gives us a general introductory understanding of how layers work inside ANNs. Let's now see how our model can be expressed in code with Keras.
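To make the forward pass concrete, here is a minimal NumPy sketch (not part of the original post) of our example 8 → 6 → 6 → 4 network. The weight values are random placeholders, and biases are omitted to mirror the formula node output = activation(weighted sum of inputs) above:

```python
import numpy as np

def relu(x):
    # Transforms each value to max(0, value).
    return np.maximum(0, x)

def softmax(x):
    # Transforms the values into a probability distribution over classes.
    e = np.exp(x - x.max())
    return e / e.sum()

# One hypothetical sample with eight features, one per input node.
sample = np.array([0.5, -1.2, 3.0, 0.0, 2.2, -0.7, 1.1, 0.4])

# Randomly initialized weights: one matrix per layer-to-layer connection.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 6))  # input layer  -> first hidden layer
W2 = rng.normal(size=(6, 6))  # first hidden -> second hidden layer
W3 = rng.normal(size=(6, 4))  # second hidden -> output layer

# Forward pass: each layer computes activation(weighted sum of inputs).
h1 = relu(sample @ W1)
h2 = relu(h1 @ W2)
output = softmax(h2 @ W3)

print(output)        # four values, one per prediction class
print(output.sum())  # softmax outputs sum to 1
```

With untrained random weights the output is meaningless; training (covered later) is what adjusts the weights so the output matches the correct class.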
Defining the neural network in code with Keras

In our previous discussion, we saw how to use Keras to build a sequential model. Now, let's do this for our example network.

We'll start out by defining a list of Dense objects, our layers. This list will then be passed to the constructor of the sequential model.

Remember, our network looks like this:

Given this, we have:

layers = [
    Dense(units=6, input_shape=(8,), activation='relu'),
    Dense(units=6, activation='relu'),
    Dense(units=4, activation='softmax')
]

Notice how the first Dense object specified in the list is not the input layer. The first Dense object is the first hidden layer. The input layer is specified as a parameter to the first Dense object's constructor.

Our input shape is eight. This is why our input shape is specified as input_shape=(8,). Our first hidden layer has six nodes, as does our second hidden layer, and our output layer has four nodes.

For now, just note that we are using an activation function called relu (activation='relu') for both of our hidden layers and an activation function called softmax (activation='softmax') for our output layer. We'll cover these functions in more detail in our next post on activation functions.

Our final product looks like this:

from keras.models import Sequential
from keras.layers import Dense, Activation

layers = [
    Dense(units=6, input_shape=(8,), activation='relu'),
    Dense(units=6, activation='relu'),
    Dense(units=4, activation='softmax')
]

model = Sequential(layers)

This is how our model can be expressed in code using Keras. Hopefully now you have a general understanding about what layers are in a neural network and how they function. I'll see ya in the next one!
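As a quick sanity check (not from the original post), we can count by hand the learnable parameters this model will have. Keras's Dense layers include one bias per node by default, so each layer contributes inputs × units weights plus units biases:

```python
# Layer sizes from the example network: 8 inputs -> 6 -> 6 -> 4 outputs.
layer_sizes = [8, 6, 6, 4]

# Parameters per Dense layer: (inputs * units) weights + units biases.
params = [
    n_in * n_out + n_out
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:])
]

print(params)       # [54, 42, 28]
print(sum(params))  # 124
```

This total of 124 is what Keras reports for this model via model.summary(), which is a handy way to verify that the layers were wired up as intended.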


