5 Different Types of Neural Networks - ProjectPro
Last Updated: 28 Feb 2022

Understand the idea behind the neural network algorithm, the definition of a neural network, the mathematics behind the algorithm, and the different types of neural networks to become a neural network pro.

Let's Have Some Fun Before That... Game Time!

Instead of starting with a mostly complete neural network chart, let us play a fun game first. Below you'll find a mixture of red balls and black circles; your task is to count the number of balls of each color.

Too easy, right? Well, for most humans, it is. But what if I wanted a computer to solve this task? Is it possible for it to do that? It turns out it is. A similar problem solved by one of the professors from Cornell University (CU) is now widely considered the first step towards Artificial Intelligence. In 1958, Frank Rosenblatt from CU successfully demonstrated that a computer could separate cards marked on the left from cards marked on the right after 50 trials. Let us find out in the next section how exactly he did that.

Table of Contents
What is a Perceptron?
Mathematical Model of the Perceptron
Algorithm
What are Neural Networks?
How Do Neural Networks Work?
Different Kinds of Neural Networks
Artificial Neural Network
Radial Basis Functional Neural Network (RBFNN)
Convolutional Neural Network (CNN)
Recurrent Neural Networks (RNN)
Autoencoders
Mastering Neural Networks through Hands-On Projects

What is a Perceptron?

The perceptron is one of the simplest binary classifiers; it separates two classes from each other by learning their features. For example, consider the famous Iris Dataset, whose features are the widths and lengths of sepals and petals for three classes of flowers: Iris setosa, Iris virginica, and Iris versicolor. The dataset was collected by Dr. Edgar Anderson and contains 150 instances, each having four length values and a corresponding flower class.

Image: Iris flowers (left) and the four measurements that form the features of the Iris Dataset (right). Source: Freepik.com (left), Digital Image Processing textbook [1]

To keep things simple, let us consider only two features - petal length (cm) and sepal length (cm) - for two flowers, Iris setosa and Iris versicolor. If we plot these features on a graph, this is what it will look like (a short sketch for reproducing such a plot is given at the end of this section):

Carefully observe the graph and note that we can easily separate the two flowers from each other based on the two characteristics. In other words, one can effortlessly draw a straight line between the two and set threshold values for the two lengths for each flower. The perceptron solves exactly this problem: it tries to come up with the required equation of such a line. But how is that possible? We'll explore the answer to this now.
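Before moving on to the mathematics, here is one way to reproduce the scatter plot of the two features mentioned above. This is a minimal sketch, assuming scikit-learn and matplotlib are installed; the column indices follow scikit-learn's Iris feature ordering (sepal length, sepal width, petal length, petal width).

```python
# A minimal sketch: plot sepal length vs. petal length for Iris setosa and Iris versicolor.
# Assumes scikit-learn and matplotlib are available.
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris()
X, y = iris.data, iris.target      # classes: 0 = setosa, 1 = versicolor, 2 = virginica

# Keep only setosa and versicolor
mask = y < 2
X, y = X[mask], y[mask]

for label, name, marker in [(0, "Iris setosa", "o"), (1, "Iris versicolor", "x")]:
    pts = X[y == label]
    plt.scatter(pts[:, 0], pts[:, 2], marker=marker, label=name)  # col 0: sepal length, col 2: petal length

plt.xlabel("Sepal length (cm)")
plt.ylabel("Petal length (cm)")
plt.legend()
plt.show()
```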
Mathematical Model of the Perceptron

In essence, a perceptron takes in the features of an instance (x = {x1, x2, x3, ..., xn}) from the dataset, multiplies each feature value by a certain weight (w = {w1, w2, w3, ..., wn}), and adds a bias term (b) to the result. This function, h(x), maps the input vector to the activation function's output. Look at the figure below; it will help you understand this better.

The output of the function h(x) decides which class the instance belongs to. If the result is above zero, we say the instance x belongs to class A1; otherwise, if the output is less than zero, it belongs to class A2. We can write this mathematically as

h(x) = w1*x1 + w2*x2 + ... + wn*xn + b,

with x assigned to class A1 if h(x) > 0 and to class A2 if h(x) < 0.

But how does a perceptron learn these weights so that the instance is labeled with its correct class? We are now ready to answer this.

Algorithm:

For simplicity, we consider the input features as a vector and append a 1 to it at the end, so that the input to the activation function is written as y = {y1, y2, y3, ..., yn, 1} and the weight vector becomes w = {w1, w2, w3, ..., wn, b}. We can now write the function h as

h(y) = w · y.

Here, the vectors y and w are called the augmented input vector and the augmented weight vector, respectively. Using this notation, the algorithm of a perceptron can be written as follows.

Start with a weight vector w(1) with arbitrary values. The weight vector is then updated iteratively; in its standard form, the update rule is

w(k+1) = w(k) + β·y   if y belongs to class A1 and w(k)·y ≤ 0,
w(k+1) = w(k) − β·y   if y belongs to class A2 and w(k)·y ≥ 0,
w(k+1) = w(k)         otherwise,

where β > 0 represents the correction increment (also called the learning increment or learning rate). The first two cases refer to situations where the perceptron has identified the class wrongly, so the weights have to be updated. If the class has been identified correctly, the weights need not change, as the third case shows. The learned weights can then be used to plot the line that separates the two classes.

You may wonder whether such a simple algorithm can always give the correct answer. Well, it cannot. The perceptron algorithm is limited to cases where the two classes are linearly separable, that is, where a single straight line (or hyperplane) is enough to separate the objects of the two classes. That is the only case in which the algorithm converges to the correct weights.

Before we move on to the snippet of code that implements this algorithm, let us play a fun quiz.

Question: What was Frank Rosenblatt working on that led to the birth of the idea of a perceptron?

Studying the way neurons in a human brain transfer information
Studying the way a fly's eye decides the path along which the fly flees
Studying the behavior of a cat towards red and blue balls
Studying the response of a prey fish to predators

CODE: The code is simple and easy to understand; read the comments for a better explanation (a sketch is given below).

Test Yourself! Implement the code on the two classes of the Iris Dataset and classify them on the basis of sepal length and petal length. Also, don't forget to use the learned weights to draw the line that separates the two classes on a graph.
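The following is a minimal sketch of the perceptron training loop described above, written with NumPy. The function names, the fixed number of epochs, the zero-valued starting weights, and the +1/−1 label encoding are illustrative choices, not part of the original algorithm statement.

```python
# A minimal sketch of the perceptron update rule described above (NumPy only).
# Labels are assumed to be +1 (class A1) and -1 (class A2); the epoch limit and
# learning rate are illustrative choices.
import numpy as np

def train_perceptron(X, labels, beta=0.1, epochs=100):
    """X: (n_samples, n_features); labels: array of +1 / -1."""
    # Augment every input with a trailing 1 so the bias b is learned as the last weight.
    Y = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Y.shape[1])                     # starting weights (arbitrary; zeros here)

    for _ in range(epochs):
        updated = False
        for y_vec, label in zip(Y, labels):
            if label == +1 and w @ y_vec <= 0:   # class A1 instance misclassified
                w += beta * y_vec
                updated = True
            elif label == -1 and w @ y_vec >= 0: # class A2 instance misclassified
                w -= beta * y_vec
                updated = True
        if not updated:                          # converged: every sample classified correctly
            break
    return w                                     # last element of w is the bias term b

def predict(w, X):
    Y = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.where(Y @ w > 0, +1, -1)
```

For the Test Yourself exercise, X would hold the sepal-length and petal-length columns for the two flowers, with Iris setosa labeled +1 and Iris versicolor labeled −1; the separating line is then w[0]*x1 + w[1]*x2 + w[2] = 0.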
We are now ready to move on to one of the most widely used algorithms, the neural network. This algorithm is partly based on the perceptron algorithm that we just finished learning. If all this was a bit rigorous for you, please go grab a snack and reward yourself for coming this far.

Recommended Reading: CNN vs RNN - Choose the Right Neural Network for Your Project

What are Neural Networks?

Definition of Neural Network: A neural network, as the name suggests, is a network of neurons where each neuron behaves like the perceptron we just finished discussing. The algorithm is based upon the operations of a biological neural system. It aims at recognizing the pattern between the input features and the expected output by minimizing the error between the predicted outcome and the actual output.

Neural Networks and Deep Learning: Deep learning is a subfield of machine learning that consists of algorithms that mimic how a human brain functions, and the basis of most such algorithms is the neural network (NN). The reason for its popularity is the large number of problems it has helped solve. From face recognition to object detection to stock prediction, NNs are at the heart of such solutions. The applications of NNs are no longer limited to images or numbers: with the invention of exciting architectures like LSTM and GRU, neural networks have expanded their reach to Natural Language Processing problems. So, what lies inside a neural network algorithm? Continue to find out.

How Do Neural Networks Work?

Let us begin with the most common way of visualizing a neural network architecture, as shown in Figure 1.

A neural network takes a feature vector from the dataset as input, just like a perceptron. But unlike the perceptron, this algorithm works for more than two classes, so it can have more than two outputs. Let us understand the algorithm step by step.

The first step begins at the input layer (Fig. 1), where the neural network receives the feature vector x = {x1, x2, x3, ..., xn} from the dataset. Each orange-colored circle of Fig. 1 represents an element of this feature vector.

The second step involves connecting the input vector to all the neurons of the next layer. Each neuron of this layer receives a weighted sum of the input-vector elements along with a bias term. Mathematically, the net input to neuron j is

net_j = Σ_i w_ji · x_i + b_j.

The outcome is then passed through an activation function a(·), so that the output of each neuron is given by

o_j = a(net_j).

Some of the popular activation functions (for example, the sigmoid, tanh, and ReLU functions) are shown in the figure below.

Image source: Handbook of Neural Network Signal Processing [2]

Repeat step 2 for all the hidden layers - the layers that lie between the input and output layers. The key point to remember is that the activation function need not be the same for all the hidden layers. Depending on the problem at hand, the output layer usually has a different activation function, as the neurons of the output layer are responsible for assigning the feature vector to one of the expected classes. The number of neurons in the output layer has to equal the number of expected classes, each neuron representing one class. The neuron that produces the highest output value identifies the class for the input feature vector.

Now that we have figured out how the output is evaluated, what remains to unravel is how the network learns the correct weights. For that, we first compute the error at the output neurons. E_i, the error for a single pattern vector x_i, is defined (in its usual squared-error form) as

E_i = (1/2) Σ_{j=1..N} (z_j − o_j)²,

where j = 1, 2, 3, ..., N, N is the number of different classes in the dataset, o_j is the output value of the j-th neuron of the output layer, and z_j is the desired response for the j-th neuron of the output layer.
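A minimal NumPy sketch of this forward pass and single-pattern error computation is given below. The layer sizes, the sigmoid activation for both layers, and the random initialization are illustrative assumptions, not prescriptions from the text above.

```python
# A minimal sketch of the forward pass and squared-error computation described above.
# Layer sizes, sigmoid activations, and random initialization are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_inputs, n_hidden, n_classes = 4, 5, 3          # e.g. 4 Iris features, 3 flower classes

# Weights and biases for the hidden and output layers
W1, b1 = rng.normal(size=(n_hidden, n_inputs)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(n_classes, n_hidden)), np.zeros(n_classes)

def forward(x):
    net1 = W1 @ x + b1          # weighted sum plus bias for each hidden neuron
    o1 = sigmoid(net1)          # activation of the hidden layer
    net2 = W2 @ o1 + b2
    o2 = sigmoid(net2)          # activation of the output layer, one value per class
    return o2

x = np.array([5.1, 3.5, 1.4, 0.2])               # one pattern vector
z = np.array([1.0, 0.0, 0.0])                    # desired response (one-hot class label)

o = forward(x)
E_i = 0.5 * np.sum((z - o) ** 2)                 # squared error for this single pattern
predicted_class = int(np.argmax(o))              # neuron with the highest output wins
print(E_i, predicted_class)
```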
But this is not the only error function in use today; there are a variety of loss functions available that you can explore.

Once the error is evaluated at the output, it needs to be minimized, and that is only possible once the whole network has learned the correct weights. To ensure this, the error is propagated back to the previous layers. We can understand how this works by considering the application of the gradient descent algorithm: each weight is adjusted in proportion to the partial derivative of the error function with respect to that weight. That is,

Δw_jk^(l) = −α · ∂E / ∂w_jk^(l),

where α represents the learning parameter and the superscript denotes the layer whose parameters are being considered.

After performing the necessary algebra, we end up with the following algorithm (stated here in its standard form). For any two layers l and l−1, the weight w_jk^(l) that connects neuron k of layer l−1 to neuron j of layer l is modified using

w_jk^(l) ← w_jk^(l) + α · δ_j^(l) · o_k^(l−1).

If j denotes a neuron of the output layer (l = L), the parameter δ is evaluated as

δ_j^(L) = (z_j − o_j) · a′(net_j^(L)).

If j denotes a neuron of a hidden layer l and p represents a neuron of the hidden layer l+1, the parameter δ is evaluated as

δ_j^(l) = a′(net_j^(l)) · Σ_p w_pj^(l+1) · δ_p^(l+1).

That's all. We are all set with the mathematics. Grab another snack to energize yourself for the next section.

Different Kinds of Neural Networks

Now that you know the basics of a feedforward neural network, let us explore how we can add interesting layers to solve exciting problems.

Artificial Neural Network (ANN): The neural network that we explained in the previous section is often referred to as an artificial neural network. We can easily skip this one, as we have discussed it already.

Radial Basis Functional Neural Network (RBFNN): A special class of neural networks consisting of only three layers: an input layer, a hidden layer, and an output layer. As is evident from the name, it utilizes radial basis functions (RBFs) - such as the Gaussian, thin-plate spline, and multiquadric functions - as the activation function of the hidden layer. It works somewhat like the K-Means clustering algorithm and is used in situations where the instances are not linearly separable. The idea of using RBFs is to transform the variables into a higher dimension where the instances of the dataset become linearly separable. Here is what the architecture of an RBFNN looks like:

The training algorithm for an RBFNN is different from that of an ANN and requires a few more parameters, other than the learning increment, for computation.

Convolutional Neural Network (CNN): As the name suggests, this neural network involves the convolution operation. This type of neural network has wide applications in image classification and object detection. It receives an image at the input, and the features of the image are extracted through the convolution operation. In its standard discrete one-dimensional form, the convolution operation is defined as

h(x) = Σ_{k=−s..s} w(k) · y(x − k),

where y represents the input image vector, w represents the weights (also called the filter or kernel), and s = (t − 1)/2, where 1 × t is the odd size of the kernel.

As an example, consider the following values for the input vector y = [2, 1, 2, 3, 4, 6, 8, 1] and the kernel w = [0, 1, 0, 0, 0].

Note that we have a problem if we start from the origin, as we cannot define the operation there: the kernel extends beyond the beginning of the input. The solution is the padding operation, which involves adding the required number of zeroes to the input vector so that the convolution operation is defined everywhere. The output at x = 0 can then be computed directly from the zero-padded input, as the sketch below shows.
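Here is a minimal NumPy sketch of the padded 1D convolution for the example vector and kernel above. The zero-padding width and the exact indexing convention (kernel index k running from −s to s, applied to y(x − k)) are assumptions about the convention intended in the text, so an explicit loop is used to keep the indexing visible.

```python
# A minimal sketch of the padded 1D convolution example above.
# Implements h(x) = sum_{k=-s..s} w(k) * y(x - k); the zero-padding width s on each
# side is an assumption chosen so the output is defined at every x.
import numpy as np

y = np.array([2, 1, 2, 3, 4, 6, 8, 1], dtype=float)   # input vector from the example
w = np.array([0, 1, 0, 0, 0], dtype=float)            # kernel of odd size t = 5
s = (len(w) - 1) // 2                                 # s = (t - 1) / 2 = 2

y_padded = np.pad(y, s)                               # add s zeroes on both ends

h = np.zeros_like(y)
for x in range(len(y)):
    # w index k runs from -s to s; the y index (x - k) is shifted by s because of the padding
    h[x] = sum(w[k + s] * y_padded[x - k + s] for k in range(-s, s + 1))

print(h)   # h[0] is the output at x = 0 discussed above
```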
Image: Architecture of LeNet-5 [3].

Notice that the input to the network is an image. There are multiple convolution layers, denoted by C, and subsampling layers, denoted by S. The subsampling layers are simple layers that shrink the feature maps by using operations such as taking the average or the maximum of a small block of elements (for example, four neighboring elements). This model, LeNet-5, was used by its authors to recognize handwritten and machine-printed characters. There can be many more exciting applications; for example, you could use it to identify your favorite cartoon character.

Image source: seekpng.com

And if you don't get accurate results using LeNet-5, you may switch to more recent CNNs like AlexNet, VGG, ResNet, Inception, Xception, etc.

Recurrent Neural Networks (RNN): The word recurrent means "occurring often or repeatedly." The name suggests that there must be some operation happening many times, or a repeated calculation, and that is indeed the case with an RNN. In an RNN, each output element is evaluated as a function of the previous elements of the output, and all the output elements are calculated by applying the same update rule to the earlier outcomes. This is possible because the layers of an RNN share weights. To understand this better, consider the figure below.

This figure sums up the basic idea of an RNN. The input vector of specific dimensions is fed to the hidden layers, and the output is evaluated. However, there is also a circular arrow that points back at the input, indicating that the output is fed back into the network.

RNNs are used for processing sequential data - for example, in Natural Language Processing (NLP) applications, predicting the next word in a sentence while keeping in mind the sequence of words already entered. We see Google Keyboard helping us with this every day.

So, if there are four words in a sentence and we want to predict the fifth word, we can use an RNN. The network will unroll itself by producing four copies of its layers, one for each word. The words are, of course, converted to vectors using embedding techniques like word2vec or one-hot encoding. The network starts by evaluating the first word, x1, at time t = 1. After that, the output s1 is computed using an activation function. Next, at time t = 2, the output is fed back to the input along with the second word of the sentence. Again, the outcome is evaluated using an activation function, and so on. Notice that the weight parameters remain the same for all these calculations, which is what gives the RNN its recurrent behavior. Note that the recurrence is with respect to time.

After evaluating the final output, the loss function is computed, and the error is propagated back to update the weights. Many recent algorithms, like Long Short-Term Memory networks (LSTM), Gated Recurrent Units (GRU), and attention-based models, have RNNs as a part of their architecture.

Autoencoders: These are a special kind of neural network that consists of three main parts: an encoder, a code, and a decoder. For these networks, the target output is the same as the input. They compress the information received at the input into a lower-dimensional code, which they then use to rebuild the output. Both the encoder and the decoder have an ANN-based architecture and are usually mirror images of each other. The idea of placing a code between the encoder and the decoder is that a few changes can be introduced into the input vector while still expecting (almost) the same output. It might seem odd at first, but imagine passing a noisy image at the input: a trained autoencoder can then present you with a picture free of the noise. Autoencoders are thus widely used for anomaly detection, data denoising, and dimensionality reduction.
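Below is a minimal sketch of such an encoder-code-decoder structure using the Keras API from TensorFlow. The 784-dimensional input (a flattened 28x28 image), the 32-dimensional code, the single hidden layer on each side, and the placeholder training data are illustrative assumptions.

```python
# A minimal encoder-code-decoder sketch in Keras. The input size (a flattened 28x28
# image), the 32-dimensional code, and the layer sizes are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

input_dim, code_dim = 784, 32

inputs = keras.Input(shape=(input_dim,))
# Encoder: compress the input down to the low-dimensional code
encoded = layers.Dense(128, activation="relu")(inputs)
code = layers.Dense(code_dim, activation="relu")(encoded)
# Decoder: mirror image of the encoder, rebuilding the original dimensionality
decoded = layers.Dense(128, activation="relu")(code)
outputs = layers.Dense(input_dim, activation="sigmoid")(decoded)

autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

# The training target is the (clean) input itself; for denoising, a noisy copy would
# be fed as the input while the clean image is kept as the target.
X = np.random.rand(256, input_dim).astype("float32")   # placeholder data
autoencoder.fit(X, X, epochs=1, batch_size=32, verbose=0)
```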
Mastering Neural Networks through Hands-On Projects

Congratulations! You are now done learning about one of the most famous families of algorithms used by data scientists. But, as they say, knowledge is incomplete without action, so it is important that you also explore relevant code that can guide you on how to apply neural network algorithms to solve real-world problems. Too lazy to google for neural network project ideas? Don't worry, we've got you covered with some innovative Neural Network Project Ideas that will add great value to your data science or machine learning portfolio.

References

[1] Gonzalez, R. C., & Woods, R. E. (2002). Digital Image Processing.
[2] Hu, Y. H., & Hwang, J. (2002). Handbook of Neural Network Signal Processing.
[3] LeCun, Y., Bottou, L., Bengio, Y., & Haffner, P. (1998). Gradient-Based Learning Applied to Document Recognition. Proceedings of the IEEE, pp. 2278-2324.