6 Types of Artificial Neural Networks Currently Being Used in ML


Published on January 15, 2018, in Developers Corner. By Kishan Maladkar.

Artificial neural networks are computational models that work similarly to the functioning of a human nervous system. There are several kinds of artificial neural networks. These types of networks are implemented based on the mathematical operations and the set of parameters required to determine the output. Let's look at some of the neural networks:

1. Feedforward Neural Network – Artificial Neuron:

This neural network is one of the simplest forms of ANN, where the data or the input travels in one direction. The data passes through the input nodes and exits on the output nodes. This neural network may or may not have hidden layers. In simple words, it has a front-propagated wave and no backpropagation, usually using a classifying activation function.

Below is a single-layer feed-forward network. Here, the sum of the products of inputs and weights is calculated and fed to the output. The output is considered if it is above a certain value, i.e. a threshold (usually 0): the neuron fires with an activated output (usually 1), and if it does not fire, the deactivated value is emitted (usually -1).

Applications of feedforward neural networks are found in computer vision and speech recognition, where classifying the target classes is complicated. These kinds of neural networks are responsive to noisy data and easy to maintain. This paper explains the usage of a feedforward neural network for X-ray image fusion, a process of overlaying two or more images based on the edges. Here is a visual description.

2. Radial Basis Function Neural Network:

Radial basis functions consider the distance of a point with respect to the center. RBF networks have two layers: first, the features are combined with the radial basis function in the inner layer, and then the output of these features is taken into consideration while computing the same output in the next time-step, which is basically a memory.
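The single-layer feed-forward computation described in section 1 above (weighted sum of inputs, threshold at 0, activated output 1, deactivated output -1) can be sketched as follows; the weights and inputs here are illustrative values, not from the article:

```python
import numpy as np

def perceptron(x, w, b=0.0):
    """Single-layer feedforward neuron: weighted sum, then threshold.

    Fires with an activated output (1) if the weighted sum exceeds the
    threshold (0); otherwise emits the deactivated value (-1).
    """
    activation = np.dot(w, x) + b
    return 1 if activation > 0 else -1

# Illustrative example: two inputs with hand-picked weights.
x = np.array([0.5, -0.2])
w = np.array([0.8, 0.4])
print(perceptron(x, w))  # activation = 0.5*0.8 + (-0.2)*0.4 = 0.32 > 0, so 1
```

Training such a network amounts to adjusting `w` and `b` so the threshold separates the target classes.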
Below is a diagram that represents the distance calculated from the center to a point in the plane, similar to the radius of a circle. Here, the distance measure used is Euclidean; other distance measures can also be used. The model depends on the maximum reach, or the radius of the circle, in classifying the points into different categories. If the point is in or around the radius, the likelihood of the new point being classified into that class is high. There can be a transition while changing from one region to another, and this can be controlled by the beta function.

This neural network has been applied in power restoration systems. Power systems have increased in size and complexity. Both factors increase the risk of major power outages. After a blackout, power needs to be restored as quickly and reliably as possible. This paper describes how RBF networks have been implemented in this domain.

Power restoration usually proceeds in the following order:

- The first priority is to restore power to essential customers in the communities. These customers provide health care and safety services to all, and restoring power to them first enables them to help many others. Essential customers include health care facilities, school boards, critical municipal infrastructure, and police and fire services.
- Then focus on major power lines and substations that serve larger numbers of customers.
- Give higher priority to repairs that will get the largest number of customers back in service as quickly as possible.
- Then restore power to smaller neighborhoods and individual homes and businesses.

The diagram below shows the typical order of the power restoration system. Referring to the diagram, first priority goes to fixing the problem at point A, on the transmission line. With this line out, none of the houses can have power restored. Next comes fixing the problem at B on the main distribution line running out of the substation; houses 2, 3, 4 and 5 are affected by this problem. Next comes fixing the line at C, affecting houses 4 and 5. Finally, we would fix the service line at D to house 1.
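A minimal sketch of the radial basis layer described above: Gaussian activations that fall off with Euclidean distance from each center, with a beta parameter controlling how sharp the transition between regions is. The centers, beta value, and test point here are illustrative assumptions:

```python
import numpy as np

def rbf_features(x, centers, beta=1.0):
    """Hidden-layer activations of an RBF network.

    Each unit responds most strongly when x is near its center and
    falls off with squared Euclidean distance; beta controls how
    quickly the response decays, i.e. how sharp the transition
    between one region and another is.
    """
    dists_sq = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-beta * dists_sq)

# Illustrative: two centers in the plane, and a point near the first one.
centers = np.array([[0.0, 0.0], [3.0, 3.0]])
phi = rbf_features(np.array([0.1, -0.1]), centers, beta=1.0)
print(phi)  # first activation near 1, second near 0
```

A linear output layer over these activations then assigns the point to the class whose center it falls in or around.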
3. Kohonen Self-Organizing Neural Network:

The objective of a Kohonen map is to map input vectors of arbitrary dimension to a discrete map comprised of neurons. The map needs to be trained to create its own organization of the training data. It comprises either one or two dimensions. When training the map, the location of the neuron remains constant but the weights differ depending on the value. This self-organization process has different parts: in the first phase, every neuron value is initialized with a small weight and the input vector.

In the second phase, the neuron closest to the point is the 'winning neuron', and the neurons connected to the winning neuron will also move towards the point, as in the graphic below. The distance between the point and the neurons is calculated by the Euclidean distance; the neuron with the least distance wins. Through the iterations, all the points are clustered and each neuron represents a kind of cluster. This is the gist behind the organization of the Kohonen neural network.

The Kohonen neural network is used to recognize patterns in data. Its applications can be found in medical analysis, clustering data into different categories. A Kohonen map was able to classify patients having glomerular or tubular disease with high accuracy. Here is a detailed explanation of how it is categorized mathematically using the Euclidean distance algorithm. Below is an image displaying a comparison between a healthy and a diseased glomerulus.

4. Recurrent Neural Network (RNN) – Long Short-Term Memory:

The recurrent neural network works on the principle of saving the output of a layer and feeding this back to the input to help in predicting the outcome of the layer.

Here, the first layer is formed similarly to the feedforward neural network, with the product of the sum of the weights and the features. The recurrent neural network process starts once this is computed; this means that from one time-step to the next, each neuron will remember some information it had in the previous time-step.
This makes each neuron act like a memory cell in performing computations. In this process, we need to let the neural network work on the front propagation and remember what information it needs for later use. Here, if the prediction is wrong, we use the learning rate or error correction to make small changes so that it will gradually work towards making the right prediction during backpropagation. This is how a basic recurrent neural network looks.

The application of recurrent neural networks can be found in text-to-speech (TTS) conversion models. This paper describes Deep Voice, which was developed at Baidu's Artificial Intelligence Lab in California. It was inspired by traditional text-to-speech structure, replacing all the components with neural networks. First, the text is converted to 'phonemes', and an audio synthesis model converts them into speech. RNNs are also implemented in Tacotron 2: human-like speech from text conversion. An insight about it can be seen below.

5. Convolutional Neural Network:

Convolutional neural networks are similar to feedforward neural networks, where the neurons have learnable weights and biases. Their applications are in signal and image processing, taking over from OpenCV in the field of computer vision.

Below is a representation of a ConvNet. In this neural network, the input features are taken in batch-wise, like a filter. This helps the network to remember the images in parts and compute the operations. These computations involve the conversion of the image from the RGB or HSI scale to gray-scale. Once we have this, the changes in the pixel values help to detect the edges, and images can be classified into different categories.

ConvNets are applied in techniques like signal processing and image classification. Computer vision techniques are dominated by convolutional neural networks because of their accuracy in image classification. One technique being implemented is image analysis and recognition, where agriculture and weather features are extracted from open-source satellites like LANDSAT to predict the future growth and yield of a particular land area.
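A minimal sketch of the edge-detection idea described in the ConvNet section above: convolve a gray-scale image with a small filter so that changes in pixel values stand out. The 3x3 Sobel-style kernel and the tiny image are illustrative assumptions:

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D convolution (as used in CNN layers): slide the
    kernel over the image and take the sum of elementwise products
    at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Illustrative gray-scale image with a vertical edge (dark | bright).
image = np.array([
    [0, 0, 10, 10],
    [0, 0, 10, 10],
    [0, 0, 10, 10],
    [0, 0, 10, 10],
], dtype=float)

# Sobel-style kernel that responds to horizontal changes in intensity.
kernel = np.array([
    [-1, 0, 1],
    [-2, 0, 2],
    [-1, 0, 1],
], dtype=float)

edges = convolve2d(image, kernel)
print(edges)  # large values where the pixel intensity jumps
```

In a trained ConvNet, the kernel values are not hand-picked like this; they are learnable weights adjusted during training.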
6. Modular Neural Network:

Modular neural networks have a collection of different networks working independently and contributing towards the output. Each neural network has a set of inputs that is unique compared to the other networks constructing and performing sub-tasks. These networks do not interact with or signal each other in accomplishing the tasks.

The advantage of a modular neural network is that it breaks down a large computational process into smaller components, decreasing the complexity. This breakdown helps in decreasing the number of connections and negates the interaction of these networks with each other, which in turn increases the computation speed. However, the processing time will depend on the number of neurons and their involvement in computing the results.

Below is a visual representation.

Modular neural networks (MNNs) are a rapidly growing field in artificial neural network research. This paper surveys the different motivations for creating MNNs: biological, psychological, hardware, and computational. Then, the general stages of MNN design are outlined and surveyed as well, viz., task decomposition techniques, learning schemes and multi-module decision-making strategies.

Kishan Maladkar holds a degree in Electronics and Communication Engineering and is exploring the field of Machine Learning and Artificial Intelligence. A Data Science enthusiast who loves to read about computational engineering and contribute towards the technology shaping our world. He is a Data Scientist by day and a gamer by night.
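The modular idea described in section 6 can be sketched as independent sub-networks, each seeing only its own slice of the input, with their outputs combined at the end and no module-to-module interaction. The module sizes, random weights, and mean-combiner here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

class Module:
    """An independent sub-network: one hidden layer over its own input slice."""
    def __init__(self, n_in, n_hidden):
        self.w1 = rng.normal(size=(n_hidden, n_in))
        self.w2 = rng.normal(size=n_hidden)

    def forward(self, x):
        h = np.tanh(self.w1 @ x)   # hidden activations for this module only
        return self.w2 @ h         # scalar contribution to the final output

def modular_forward(x, modules, slices):
    """Each module processes only its own slice of x; the outputs are
    combined without any interaction between modules."""
    outputs = [m.forward(x[s]) for m, s in zip(modules, slices)]
    return np.mean(outputs)

# Illustrative: a 6-dimensional input split across two modules.
slices = [slice(0, 3), slice(3, 6)]
modules = [Module(3, 4), Module(3, 4)]
x = np.array([0.1, -0.2, 0.3, 0.4, -0.5, 0.6])
print(modular_forward(x, modules, slices))
```

Because each module depends only on its own slice, the modules can be trained and evaluated in parallel, which is the source of the speed-up the article mentions.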


