Artificial Neural Networks Applications and Algorithms - Medium


What is an Artificial Neural Network?

Artificial Neural Networks are computational models inspired by the human brain. Many recent advances in Artificial Intelligence, including voice recognition, image recognition, and robotics, have been made using Artificial Neural Networks. An Artificial Neural Network is, in general, a biologically inspired network of artificial neurons configured to perform specific tasks: a simulation of biological computation carried out on a computer. These biological methods of computing are considered by many to be the next major advancement in the computing industry.

What is a Neural Network?

The term "neural" is derived from the basic functional unit of the human (animal) nervous system: the neuron, or nerve cell, present in the brain and other parts of the body. A neural network is a group of algorithms that attempts to recognize the underlying relationships in a set of data, much as the human brain does. A neural network adapts to changing input, so that the network produces the best possible result without redesigning the output procedure.

Parts of a Neuron and their Functions

The typical nerve cell of the human brain comprises four parts:

- Dendrites: receive signals from other neurons.
- Soma (cell body): sums all the incoming signals to generate the input.
- Axon: when the sum reaches a threshold value, the neuron fires, and the signal travels down the axon to the other neurons.
- Synapse: the point of interconnection between one neuron and others. The amount of signal transmitted depends upon the strength (synaptic weight) of the connection. Connections can be inhibitory (decreasing the signal) or excitatory (increasing the signal) in nature.

So a neural network, in general, is a highly interconnected network of billions of neurons with trillions of interconnections between them.

What is the Difference Between a Computer and the Human Brain?

Artificial Neural Networks and Biological Neural Networks: Similarity

Neural Networks resemble the human brain in the following two ways:
- A neural network acquires knowledge through learning.
- A neural network's knowledge is stored within the inter-neuron connection strengths, known as synaptic weights.

Von Neumann architecture-based computing vs. ANN-based computing:

- Serial processing: instructions and problem rules are processed one at a time (sequentially). ANNs use parallel processing: several units operate simultaneously (multitasking).
- Von Neumann machines function logically, with a set of if-and-else rules (a rule-based approach). ANNs function by learning patterns from given input (images, text, video, etc.).
- Von Neumann machines are programmable in higher-level languages such as C, Java, or C++. ANNs, in essence, program themselves.
- Von Neumann machines require large (or error-prone) parallel processors. ANNs make use of application-specific multi-chips.

How Does an Artificial Neural Network Work?

Artificial Neural Networks can be viewed as weighted directed graphs in which artificial neurons are nodes, and directed edges with weights are connections between neuron outputs and neuron inputs. The Artificial Neural Network receives information from the external world in the form of patterns and images in vector form. These inputs are mathematically designated by the notation x(n) for n inputs.

Each input is multiplied by its corresponding weight. Weights are the information the neural network uses to solve a problem; typically, a weight represents the strength of the interconnection between neurons inside the network. The weighted inputs are all summed up inside the computing unit (the artificial neuron). If the weighted sum is zero, a bias is added to make the output non-zero or to scale up the system response; the bias has both weight and input fixed at 1. The sum can take any numerical value from 0 to infinity. To limit the response to the desired value, a threshold is set, and the sum is passed through an activation function. The activation function is the transfer function used to obtain the desired output. There are linear as well as nonlinear activation functions; commonly used ones are the binary step function and the sigmoidal and tan-hyperbolic sigmoidal functions (both nonlinear).

Binary: the output takes only two values, 0 or 1. For this, a threshold value is set up; if the net weighted input is greater than 1, the output is taken as 1, otherwise 0.
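The computation just described - inputs multiplied by weights, summed with a bias, then passed through a binary threshold - can be sketched in a few lines of Python (the input values and weights below are made-up examples):

```python
import numpy as np

def artificial_neuron(x, w, b=0.0, threshold=0.0):
    """Weighted sum of inputs plus bias, passed through a binary
    (step) activation: output 1 if the sum exceeds the threshold."""
    s = np.dot(x, w) + b          # weighted sum inside the computing unit
    return 1 if s > threshold else 0

# Example: three inputs with their synaptic weights
x = np.array([1.0, 0.5, -1.0])
w = np.array([0.4, 0.6, 0.2])
print(artificial_neuron(x, w, b=0.1))   # 0.4 + 0.3 - 0.2 + 0.1 = 0.6 > 0, so 1
```
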
Sigmoidal hyperbolic: this function has an S-shaped curve. Here the tan-hyperbolic function is used to approximate the output from the net input. The function is defined as f(x) = 1 / (1 + exp(-λx)), where λ is the steepness parameter.

Neural Network Architecture Types

- Perceptron: a network with two input units and one output unit, with no hidden layers; also known as a single-layer perceptron.
- Radial Basis Function network: similar to the feed-forward network, except that a radial basis function is used as the activation function of the neurons.
- Multilayer Perceptron: uses more than one hidden layer of neurons, unlike the single-layer perceptron; also known as a Deep Feedforward Neural Network.
- Recurrent Neural Network: a network in which hidden-layer neurons have self-connections and therefore possess memory. At any instant, a hidden-layer neuron receives activation from the lower layer as well as its own previous activation value.
- Long Short-Term Memory (LSTM) network: a network in which a memory cell is incorporated into the hidden-layer neurons.
- Hopfield network: a fully interconnected network in which each neuron is connected to every other neuron. The network is trained by setting the neuron values to the desired pattern, after which its weights are computed; the weights are not changed afterwards. Once trained on one or more patterns, the network converges to the learned patterns. It is different from the other Neural Networks.
- Boltzmann machine: similar to the Hopfield network, except that some neurons are input neurons while others are hidden. The weights are initialized randomly and learned through a backpropagation algorithm.
- Modular neural network: a combined structure of different network types (multilayer perceptron, Hopfield network, Recurrent Neural Network, etc.), each incorporated as a single module that performs an independent subtask of the complete network.
- Physical neural network: an electrically adjustable resistance material is used to emulate the function of a synapse, instead of the software simulation performed in the other network types.
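A minimal forward pass through the simplest of the feed-forward architectures above - two inputs, one sigmoidal hidden layer, one output - can be sketched as follows (all weights and inputs are made-up values, and the sigmoid implements the f(x) = 1 / (1 + exp(-λx)) formula given earlier):

```python
import math

def sigmoid(x, steepness=1.0):
    # f(x) = 1 / (1 + exp(-lambda * x)), with lambda the steepness parameter
    return 1.0 / (1.0 + math.exp(-steepness * x))

def layer(inputs, weights, biases):
    """One fully connected layer: weighted sum plus bias per neuron,
    passed through the sigmoidal activation."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Two inputs -> two hidden neurons -> one output neuron
hidden = layer([0.5, -1.0],
               weights=[[0.8, 0.2], [-0.4, 0.9]],
               biases=[0.0, 0.1])
output = layer(hidden, weights=[[1.0, -1.0]], biases=[0.0])
print(output)
```

Stacking more calls to `layer` gives the deeper feed-forward networks described above.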
Hardware Architectures for Neural Networks

Two methods are used to implement Neural Networks in hardware:

- Software simulation on a conventional computer.
- A special hardware solution for decreasing execution time.

When a network uses a small number of processing units and weights, the simulation is performed directly in software on a conventional computer (e.g., voice recognition). When Neural Network algorithms develop to the point where useful things can be done with thousands of neurons and tens of thousands of synapses, high-performance hardware becomes essential for practical operation: e.g., GPUs (graphics processing units) for Deep Learning algorithms in object recognition, image classification, etc. The performance of an implementation is measured in connections per second (cps), i.e., the number of data chunks transported through the edges of the network, while the performance of the learning algorithm is measured in connection updates per second (cups).

Learning Techniques in Artificial Neural Networks

The neural network learns by iteratively adjusting its weights and bias (threshold) to yield the desired output; these are also called free parameters. For learning to take place, the network must first be trained, using a defined set of rules known as the learning algorithm.

Training Algorithms for Artificial Neural Networks

- Gradient Descent: the simplest training algorithm, used for supervised training. When the actual output differs from the target output, the difference (error) is found, and the gradient descent algorithm changes the weights of the network so as to minimize this error.
- Back Propagation: an extension of the gradient-based delta learning rule. After the error (the difference between the desired and actual output) is found, it is propagated backward from the output layer to the input layer via the hidden layer. It is used for Multilayer Neural Networks.

Learning Data Sets in Artificial Neural Networks

Training set: a set of examples used for learning, that is, to fit the parameters (the weights) of the network. One epoch comprises one full training cycle on the training set.
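Fitting the weights on the training set uses the gradient-descent update described above. A minimal sketch of the delta rule for a single linear neuron follows; the data set and learning rate are made-up examples:

```python
import numpy as np

def delta_rule_train(X, t, lr=0.1, epochs=100):
    """Gradient descent on squared error for a single linear neuron:
    per sample, w += lr * (target - actual) * x, which moves the
    weights against the error gradient."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, t):
            y = np.dot(w, x)              # actual output
            w += lr * (target - y) * x    # delta-rule weight update
    return w

# Learn y = 2*x1 - x2 from a few consistent examples (illustrative data)
X = np.array([[1., 0.], [0., 1.], [1., 1.], [2., 1.]])
t = np.array([2., -1., 1., 3.])
w = delta_rule_train(X, t)
print(np.round(w, 2))   # approaches [2., -1.]
```

Because the examples here are exactly linear, the weights converge toward the generating values [2, -1]; on noisy data they converge to the least-squares fit instead.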
Validation set: a set of examples used to tune the hyper-parameters (the architecture) of the network, for example to choose the number of hidden units.

Test set: a set of examples used only to assess the performance (generalization) of a fully specified network, i.e., how successfully it predicts outputs for known inputs.

Five Algorithms to Train a Neural Network

- Hebbian Learning Rule
- Self-Organizing Kohonen Rule
- Hopfield Network Law
- LMS Algorithm (Least Mean Square)
- Competitive Learning

Artificial Neural Network Architecture

A typical Neural Network contains a large number of artificial neurons, called units, arranged in a series of layers:

- Input layer: contains the units (artificial neurons) that receive input from the outside world, which the network will learn from, recognize, or otherwise process.
- Hidden layer: units between the input and output layers. The job of the hidden layer is to transform the input into something the output units can use.
- Output layer: contains the units that respond with what the network has learned about the task.

Most Neural Networks are fully connected, meaning each hidden neuron is linked to every neuron in its previous (input) layer and in the next (output) layer.

Learning Techniques in Neural Networks

- Supervised learning: the training data is input to the network along with the known desired output, and the weights are adjusted until the output matches the desired value.
- Unsupervised learning: the input data is used to train the network, and the desired output is not known. The network classifies the input data and adjusts its weights by extracting features from the input.
- Reinforcement learning: the value of the output is unknown, but the network receives feedback on whether its output is right or wrong. It is sometimes described as semi-supervised learning.
- Offline (batch) learning: the weight vector and threshold are adjusted only after the entire training set has been presented to the network.
- Online learning: the weights and threshold are adjusted after each training sample is presented to the network.

Learning and Development in Neural Networks

Learning occurs as the weights inside the network are updated over many iterations. For example, suppose we have inputs in the form of two different classes of patterns, 'I' and 'O', with a bias b and y as the desired output.
We want to classify input patterns as either 'I' or 'O'. The following steps are performed:

1. Nine inputs, x1 to x9, along with the bias b (an input with value 1), are fed to the network for the first pattern.
2. Initially, the weights are set to zero.
3. The weights are then updated for each neuron using Hebb's rule: Δwi = xi·y for i = 1 to 9.
4. Finally, the new weights are found using wi(new) = wi(old) + Δwi, giving wi(new) = [1 1 1 -1 1 -1 1 1 1].
5. The second pattern is input to the network. This time the weights are not re-initialized to zero: the initial weights are the final weights obtained after presenting the first pattern, so the network retains what it learned from the first pattern.
6. Steps 1 to 4 are repeated for the second input, giving the new weights wi(new) = [0 0 0 -2 -2 -2 0 0 0].

These weights correspond to the network's learned ability to classify the input patterns successfully.

4 Different Techniques of Neural Networks

- Classification: a Neural Network can be trained to classify a given pattern or data set into a predefined class; feedforward networks are used for this.
- Prediction: a Neural Network can be trained to produce the outputs expected from a given input, e.g., stock market prediction.
- Clustering: a Neural Network can identify unique features of the data and classify them into different categories without any prior knowledge of the data.
- Association: a Neural Network can be trained to remember a particular pattern, so that when a noisy pattern is presented, the network associates it with the closest pattern in its memory or discards it. E.g., Hopfield networks, which can perform recognition, classification, and clustering.

Neural Networks for Pattern Recognition

Pattern recognition is the study of how machines can observe the environment, learn to distinguish patterns of interest from their background, and make sound and reasonable decisions about the categories of the patterns. Some examples of patterns are fingerprint images, handwritten words, human faces, and speech signals. Given an input pattern, its recognition involves one of the following tasks:

- Supervised classification: the input pattern is identified as a member of a predefined class.
- Unsupervised classification: the pattern is assigned to a hitherto unknown class.

So the recognition problem here is essentially a classification or categorization task.
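The Hebbian update Δwi = xi·y from the worked example above can be sketched in code. The 3x3 bipolar patterns and targets below are illustrative assumptions, not the article's exact figures:

```python
import numpy as np

def hebb_train(patterns, targets):
    """Hebb's rule: delta_w_i = x_i * y, accumulated over the patterns,
    starting from zero weights (as in the steps above)."""
    w = np.zeros(len(patterns[0]))
    for x, y in zip(patterns, targets):
        w += np.array(x) * y
    return w

# Illustrative 3x3 bipolar patterns: a vertical bar 'I' (target +1)
# and a ring 'O' (target -1)
I = [-1,  1, -1,
     -1,  1, -1,
     -1,  1, -1]
O = [ 1,  1,  1,
      1, -1,  1,
      1,  1,  1]
w = hebb_train([I, O], targets=[1, -1])
print(int(np.sign(np.dot(w, I))), int(np.sign(np.dot(w, O))))  # 1 -1
```

The sign of the weighted sum recovers each pattern's target, which is the sense in which the final weights "correspond to the learning ability of the network".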
The design of pattern recognition systems usually involves three aspects.

Approaches for Pattern Recognition

- Template matching
- Statistical
- Syntactic matching
- Artificial Neural Networks

Neural Networks for Deep Learning

The following Neural Network architectures are used in Deep Learning:

- Feed-forward neural networks
- Recurrent neural networks
- Multi-layer perceptrons (MLP)
- Convolutional neural networks
- Recursive neural networks
- Deep belief networks
- Convolutional deep belief networks
- Self-Organizing Maps
- Deep Boltzmann machines
- Stacked de-noising auto-encoders

Neural Networks and Fuzzy Logic

Fuzzy logic expresses degrees of truth by assigning values between 0 and 1, unlike traditional Boolean logic, which allows only 0 or 1. Fuzzy logic and neural networks have one thing in common: both can be used to solve pattern recognition problems and other problems that do not involve a mathematical model. Systems combining fuzzy logic and neural networks are called neuro-fuzzy systems; these hybrid systems combine the advantages of both to perform better. Fuzzy logic and Neural Networks have been integrated in applications such as:

- Automotive engineering
- Applicant screening for jobs
- Crane control
- Glaucoma monitoring

In a hybrid (neuro-fuzzy) model, Neural Network learning algorithms are fused with the fuzzy reasoning of fuzzy logic: the neural network determines the values of the parameters, while the if-then rules are handled by fuzzy logic.

Neural Networks for Machine Learning

- Multilayer Perceptron (supervised classification)
- Back Propagation Network (supervised classification)
- Hopfield Network (pattern association)
- Deep Neural Networks (unsupervised clustering)

Applications of Neural Networks

Neural networks have been successfully applied to a broad spectrum of data-intensive applications.

Advantages of Neural Networks

- A neural network can perform tasks that a linear program cannot.
- When an element of the neural network fails, the network can continue without any problem, thanks to its parallel nature.
- A neural network learns, and does not need to be reprogrammed.
- It can be implemented in any application, and applied without any problem.
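The fuzzy-logic idea discussed earlier - truth values anywhere between 0 and 1 rather than strictly Boolean - can be sketched with a simple triangular membership function. The "warm" temperature ranges below are made-up examples:

```python
def triangular_membership(x, left, peak, right):
    """Degree of truth in [0, 1] for a triangular fuzzy set:
    0 outside [left, right], rising linearly to 1 at the peak."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

# Fuzzy set 'warm' over temperatures in Celsius (illustrative ranges)
print(triangular_membership(20.0, 15.0, 22.5, 30.0))   # partially 'warm'
print(triangular_membership(22.5, 15.0, 22.5, 30.0))   # 1.0: fully 'warm'
```

In a neuro-fuzzy system, if-then rules operate on membership degrees like these, while a neural network tunes parameters such as the breakpoints of the membership functions.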
Limitations of Neural Networks

Face Recognition Using Artificial Neural Networks

Face recognition entails comparing an image against a database of saved faces to identify the person in the input picture. The associated face detection mechanism involves dividing an image into two parts: one containing the targets (faces) and one containing the background. Face detection is directly relevant because images must be analyzed and faces located before they can be recognized.

Learning Rules in Neural Networks

A learning rule is a type of mathematical logic that encourages a Neural Network to learn from existing conditions and improve its efficiency and performance. The learning procedure of the brain likewise modifies its neural structure: its synaptic connections strengthen or weaken depending on their activity. Learning rules in Neural Networks include:

- Hebbian learning rule: determines how to modify the weights of the nodes of a network.
- Perceptron learning rule: the network starts learning by assigning a random value to each weight.
- Delta learning rule: the modification of a node's synaptic weight is equal to the product of the error and the input.
- Correlation learning rule: similar to supervised learning.

How Can XenonStack Help You?

XenonStack can help you develop and deploy model solutions based on Neural Networks. Whatever kind of problem you face - prediction, classification, or pattern recognition - XenonStack has a solution for you.

Fraud Detection and Prevention Services

XenonStack Fraud Detection Services offer real-time fraud analysis to increase profitability. Data mining is used to quickly detect fraud, search for patterns, and spot fraudulent transactions. Data mining tools such as Machine Learning, Neural Networks, and Cluster Analysis are used to generate predictive models that prevent fraud losses.

Data Modelling Services

XenonStack offers Data Modelling using Neural Networks, Machine Learning, and Deep Learning. Data Modelling services help enterprises create a conceptual model based on the analysis of data objects. Deploy your data models on leading cloud service providers such as Google Cloud, Microsoft Azure, and AWS, or on container environments such as Kubernetes and Docker.

Originally published at https://www.xenonstack.com on February 28, 2019.


