First neural network for beginners explained (with code)


Understand and create a Perceptron

Photo by Clément H on Unsplash

So you want to create your first artificial neural network, or simply discover this subject, but have no idea where to begin? Follow this quick guide to understand all the steps!

What is a neural network?

Based on nature, neural networks are the usual representation we make of the brain: neurons interconnected with other neurons, forming a network. A piece of information transits through many of them before becoming an actual action, like "move the hand to pick up this pencil". The operation of a complete neural network is straightforward: you enter variables as inputs (for example an image, if the neural network is supposed to tell what is on an image), and after some calculations, an output is returned (following the first example, giving an image of a cat should return the word "cat").

Now, you should know that artificial neural networks are usually arranged in columns, so that a neuron of column n can only be connected to neurons from columns n-1 and n+1. There are a few types of networks that use a different architecture, but we will focus on the simplest for now. So, we can represent an artificial neural network like this:

Figure 1 — Representation of a neural network

Neural networks can usually be read from left to right. Here, the first layer is the layer in which inputs are entered. There are two internal layers (called hidden layers) that do some math, and one last layer that contains all the possible outputs. Don't bother with the "+1"s at the bottom of every column; that is something called "bias" and we'll talk about it later. By the way, the term "deep learning" comes from neural networks that contain several hidden layers, also called "deep neural networks"; Figure 1 can be considered one.

What does a neuron do?

The operations done by each neuron are pretty simple:

Figure 2 — Operations done by a neuron

First, it adds up the values of every neuron from the previous column it is connected to. In Figure 2, there are 3 inputs (x1, x2, x3) coming to the neuron, so 3 neurons of the previous column are connected to our neuron. Before being added, each value is multiplied by another variable called a "weight" (w1, w2, w3), which determines the connection between the two neurons. Each connection of neurons has its own weight, and those are the only values that will be modified during the learning process. Moreover, a bias value may be added to the total. It is not a value coming from a specific neuron, and it is chosen before the learning phase, but it can be useful for the network. After all those summations, the neuron finally applies a function called the "activation function" to the obtained value.

Figure 3 — Sigmoid function

The so-called activation function usually serves to turn the total value calculated before into a number between 0 and 1 (done for example by the sigmoid function shown in Figure 3). Other functions exist and may change the limits of the output, but they keep the same aim of bounding the value.

That's all a neuron does! Take all the values from the connected neurons multiplied by their respective weights, add them, and apply an activation function. Then the neuron is ready to send its new value to other neurons. After every neuron of a column has done this, the neural network passes to the next column. In the end, the last values obtained should be usable to determine the desired output.
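To make that concrete, here is a minimal sketch of a single neuron in Python (illustrative only, not from the original article; the names x, w and b simply stand for the inputs, weights and bias described above):

```python
import numpy

def neuron_output(x, w, b):
    # weighted sum of the incoming values, plus the bias
    total = numpy.dot(x, w) + b
    # sigmoid activation squashes the sum into a number between 0 and 1
    return 1 / (1 + numpy.exp(-total))

# illustrative values: 3 inputs (x1, x2, x3) and their weights (w1, w2, w3), as in Figure 2
print(neuron_output([0.5, 0.3, 0.2], [0.4, 0.7, 0.2], b=1))  # prints roughly 0.81
```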
Now that we understand what a neuron does, we could possibly create any network we want. However, there are other operations to implement to make a neural network learn.

How does a neural network learn?

Yep, creating variables and making them interact with each other is great, but that is not enough to make the whole neural network learn by itself. We need to prepare a lot of data to give to our network. That data includes the inputs and the outputs expected from the neural network.

Let's take a look at how the learning process works. First of all, remember that when an input is given to the neural network, it returns an output. On the first try, it can't get the right output on its own (except by luck), and that is why, during the learning phase, every input comes with its label, explaining what output the neural network should have guessed. If the guess is the right one, the current parameters are kept and the next input is given. However, if the obtained output doesn't match the label, the weights are changed. Those are the only variables that can be changed during the learning phase. This process can be imagined as multiple knobs that are turned toward different settings every time an input isn't guessed correctly. To determine which weight is best to modify, a particular process called "backpropagation" is done. We won't linger too much on that, since the neural network we will build doesn't use this exact process, but it consists of going back through the neural network and inspecting every connection to check how the output would react to a change in that weight.

Finally, there is one last parameter to know in order to control the way the neural network learns: the "learning rate". The name says it all: this value determines at what speed the neural network will learn, or more specifically, whether it will modify each weight little by little or by bigger steps. 1 is generally a good value for that parameter.

Perceptron

Okay, we know the basics, so let's look at the neural network we will create. The one explained here is called a Perceptron and is the first neural network ever created. It consists of 2 neurons in the input column and 1 neuron in the output column. This configuration allows us to create a simple classifier that distinguishes 2 groups. To better understand the possibilities and the limitations, let's see a quick example (which doesn't have much interest except for understanding):

Let's say you want your neural network to be able to return outputs according to the rules of the "inclusive or". Reminder:

if A is true and B is true, then A or B is true.
if A is true and B is false, then A or B is true.
if A is false and B is true, then A or B is true.
if A is false and B is false, then A or B is false.

If you replace the "true"s by 1 and the "false"s by 0 and put the 4 possibilities as points with coordinates on a plane, then you realize the two final groups, "false" and "true", can be separated by a single line. This is what a Perceptron can do.

On the other hand, if we check the case of the "exclusive or" (in which the case "true or true", the point (1,1), is false), then we can see that a simple line cannot separate the two groups, and a Perceptron isn't able to deal with this problem.

So, the Perceptron is indeed not a very efficient neural network, but it is simple to create and may still be useful as a classifier.

Creating our own simple neural network

Let's create a neural network from scratch with Python (3.x in the example below).

```python
import numpy, random

lr = 1  # learning rate
bias = 1  # value of bias
weights = [random.random(), random.random(), random.random()]  # weights generated in a list (3 weights in total for 2 neurons and the bias)
```

The beginning of the program just imports the libraries, defines the values of the parameters, and creates a list containing the values of the weights that will be modified (those are generated randomly).

```python
def Perceptron(input1, input2, output):
    outputP = input1 * weights[0] + input2 * weights[1] + bias * weights[2]
    if outputP > 0:  # activation function (here Heaviside)
        outputP = 1
    else:
        outputP = 0
    error = output - outputP
    weights[0] += error * input1 * lr
    weights[1] += error * input2 * lr
    weights[2] += error * bias * lr
```

Here we create a function which defines the work of the output neuron. It takes 3 parameters (the 2 values of the input neurons and the expected output). "outputP" is the variable corresponding to the output given by the Perceptron. Then we calculate the error, which is used right after to modify the weights of every connection to the output neuron.

```python
for i in range(50):
    Perceptron(1, 1, 1)  # True or true
    Perceptron(1, 0, 1)  # True or false
    Perceptron(0, 1, 1)  # False or true
    Perceptron(0, 0, 0)  # False or false
```

We create a loop that makes the neural network repeat every situation several times. This part is the learning phase. The number of iterations is chosen according to the precision we want. However, be aware that too many iterations could lead the network to overfitting, which causes it to focus too much on the treated examples, so that it couldn't get a right output on cases it didn't see during its learning phase. Our case here is a bit special, though, since there are only 4 possibilities and we give the neural network all of them during its learning phase. A Perceptron is supposed to give a correct output even on a case it has never seen before.
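Before moving on to the testing phase, a quick check (not part of the original listing) is to run the four cases through the trained weights and confirm the Perceptron reproduces the inclusive-or table; this snippet reuses the weights, bias and Heaviside rule defined above:

```python
# quick sanity check, reusing weights and bias from the code above
for a, b in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    total = a * weights[0] + b * weights[1] + bias * weights[2]
    print(a, "or", b, "->", 1 if total > 0 else 0)
```

Once the training loop has converged, this should print 1 for the first three cases and 0 for the last one.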
```python
x = int(input())
y = int(input())
outputP = x * weights[0] + y * weights[1] + bias * weights[2]
if outputP > 0:  # activation function
    outputP = 1
else:
    outputP = 0
print(x, "or", y, "is:", outputP)
```

Finally, we can ask the user to enter the values themselves to check whether the Perceptron is working. This is the testing phase. The Heaviside activation function is interesting to use in this case, since it brings all values back to exactly 0 or 1, and we are looking for a false or true result. We could try a sigmoid function instead and obtain a decimal number between 0 and 1, normally very close to one of those limits.

```python
outputP = 1 / (1 + numpy.exp(-outputP))  # sigmoid function
```

We could also save the weights that the neural network just calculated in a file, to use them later without going through another learning phase. This is done for way bigger projects, in which that phase can last days or weeks.
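For example, one simple way to do that (a minimal sketch, not from the original article; the filename weights.json is arbitrary) is to dump the list of weights with Python's json module:

```python
import json

# illustrative sketch: after the learning phase, store the learned weights on disk
with open("weights.json", "w") as f:
    json.dump(weights, f)

# in a later run: reload them and skip the learning phase entirely
with open("weights.json") as f:
    weights = json.load(f)
```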
That's it! You've made your own complete neural network. You created it, made it learn, and checked its capacities. Your Perceptron can now be modified to use it on another problem. Just change the points given during the iterations, adjust the number of loops if your case is more complex, and let your Perceptron do the classification.

Do you want to list 2 types of trees in the nearest forest and be able to determine whether a new tree is type A or B? Choose 2 features that can dissociate both types (for example height and width), and create some points for the Perceptron to place on the plane. Let it deduce a way to separate the 2 groups, and enter any new tree's point to find out which type it is.

You could later expand your knowledge and learn about bigger and deeper neural networks, which are very powerful! There are multiple aspects we didn't treat, or treated just enough for you to get the basics, so don't hesitate to go further. I would love to write about more complex neural networks, so stay tuned!

Thanks for reading! I hope this little guide was useful; if you have any questions and/or suggestions, let me know in the comments.

