10 Applications of Artificial Neural Networks in Natural Language Processing
by Olga Davydova

Since artificial neural networks allow modeling of nonlinear processes, they have turned into a very popular and useful tool for solving many problems such as classification, clustering, regression, pattern recognition, dimension reduction, structured prediction, machine translation, anomaly detection, decision making, visualization, computer vision, and others. This wide range of abilities makes it possible to use artificial neural networks in many areas. In this article, we discuss applications of artificial neural networks in Natural Language Processing (NLP) tasks. NLP includes a wide set of syntax, semantics, discourse, and speech tasks. We will describe the prime tasks in which neural networks have demonstrated state-of-the-art performance.

1. Text Classification and Categorization

Text classification is an essential part of many applications, such as web searching, information filtering, language identification, readability assessment, and sentiment analysis. Neural networks are actively used for these tasks.

In Convolutional Neural Networks for Sentence Classification, Yoon Kim presented a series of experiments with Convolutional Neural Networks (CNN) built on top of word2vec. The suggested model was tested against several benchmarks. In Movie Reviews (MR) and Customer Reviews (CR), the task was to detect positive/negative sentiment. In the Stanford Sentiment Treebank (SST-1), there were more classes to predict: very positive, positive, neutral, negative, very negative. In the Subjectivity dataset (Subj), sentences were classified into two types, subjective or objective. In TREC, the goal was to classify a question into six question types (whether the question is about a person, location, numeric information, etc.). The results of the numerous tests described in the paper show that, after little tuning of hyperparameters, the model performs excellently, suggesting that the pre-trained vectors are universal feature extractors that can be utilized for various classification tasks [1].

The article Text Understanding from Scratch by Xiang Zhang and Yann LeCun shows that it is possible to apply deep learning to text understanding from character-level inputs all the way up to abstract text concepts with the help of temporal convolutional networks (ConvNets). The authors assert that ConvNets can achieve excellent performance without knowledge of words, phrases, sentences, or any other syntactic or semantic structures of a human language [2]. To prove their assertion, several experiments were conducted. The model was tested on the DBpedia ontology classification dataset with 14 classes (company, educational institution, artist, athlete, office holder, mean of transportation, building, natural place, village, animal, plant, album, film, written work). The results indicate both good training (99.96%) and testing (98.40%) accuracy, with some improvement from thesaurus augmentation. In addition, a sentiment analysis test was performed on the Amazon Review dataset. In this study, the researchers constructed a sentiment polarity dataset with two negative and two positive labels. The result is 97.57% training accuracy and 95.07% testing accuracy. The model was also tested on the Yahoo! Answers Comprehensive Questions and Answers dataset with 10 classes (Society & Culture, Science & Mathematics, Health, Education & Reference, Computers & Internet, Sports, Business & Finance, Entertainment & Music, Family & Relationships, Politics & Government) and on the AG's corpus, where the task was news categorization into four categories (World, Sports, Business, Sci/Tech). The obtained results confirm that ConvNets require a large corpus in order to learn from scratch and achieve good text understanding.
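To make the CNN-over-word-embeddings idea from Kim's paper more concrete, here is a minimal sketch of such a sentence classifier in PyTorch: an embedding layer feeds parallel convolutional filters of several widths, and their max-pooled outputs are concatenated and scored against the classes. This is an illustrative sketch rather than the paper's exact setup; the vocabulary size, embedding dimension, filter widths, and class count below are assumed values, and in the paper the embedding layer is initialized from pre-trained word2vec vectors.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    """Toy CNN sentence classifier in the spirit of Kim (2014).

    The hyperparameters (vocab size, dimensions, filter widths) are
    illustrative assumptions, not the values used in the paper.
    """
    def __init__(self, vocab_size=20000, embed_dim=300, num_classes=2,
                 filter_widths=(3, 4, 5), num_filters=100, dropout=0.5):
        super().__init__()
        # In the paper this layer would be initialized from pre-trained word2vec vectors.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=w) for w in filter_widths
        )
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(num_filters * len(filter_widths), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        x = self.embedding(token_ids).transpose(1, 2)       # (batch, embed_dim, seq_len)
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = self.dropout(torch.cat(pooled, dim=1))   # concatenated max-over-time features
        return self.fc(features)                            # unnormalized class scores

# Quick smoke test with a random batch of ten 50-token "sentences".
model = TextCNN()
logits = model(torch.randint(1, 20000, (10, 50)))
print(logits.shape)  # torch.Size([10, 2])
```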
Siwei Lai, Liheng Xu, Kang Liu, and Jun Zhao introduced recurrent convolutional neural networks for text classification without human-designed features in their paper Recurrent Convolutional Neural Networks for Text Classification [3]. The team tested their model on four datasets: 20 Newsgroups (with four categories such as computers, politics, recreation, and religion), the Fudan Set (a Chinese document classification set that consists of 20 classes, including art, education, and energy), the ACL Anthology Network (with five languages: English, Japanese, German, Chinese, and French), and the Sentiment Treebank (with Very Negative, Negative, Neutral, Positive, and Very Positive labels). After testing, the model was compared to existing text classification methods like Bag of Words, Bigrams + LR, SVM, LDA, Tree Kernels, Recursive NN, and CNN. It turned out that neural network approaches outperform traditional methods on all four datasets, and the proposed model outperforms both CNN and Recursive NN.

2. Named Entity Recognition (NER)

The main task of named entity recognition (NER) is to classify named entities, such as Guido van Rossum, Microsoft, London, etc., into predefined categories like persons, organizations, locations, times, dates, and so on. Many NER systems have already been created, and the best of them use neural networks.

In the paper Neural Architectures for Named Entity Recognition, two models for NER were proposed. The models rely on character-based word representations learned from the supervised corpus and on unsupervised word representations learned from unannotated corpora [4]. Numerous tests were carried out using different datasets, such as CoNLL-2002 and CoNLL-2003, in English, Dutch, German, and Spanish. The team concluded that, without requiring any language-specific knowledge or resources such as gazetteers, their models show state-of-the-art performance in NER.

3. Part-of-Speech Tagging

Part-of-speech (POS) tagging has many applications, including parsing, text-to-speech conversion, information extraction, and so on. In the work Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Recurrent Neural Network, a recurrent neural network with word embeddings for the part-of-speech (POS) tagging task is presented [5]. The model was tested on the Wall Street Journal data from the Penn Treebank III dataset and achieved a tagging accuracy of 97.40%.
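For readers who want to see what such a tagger looks like in code, below is a minimal bidirectional-LSTM tagging sketch in PyTorch: word embeddings feed a BiLSTM, and a linear layer scores every token against the tag set. The vocabulary size, layer dimensions, and 45-tag inventory are illustrative assumptions and do not reproduce the reference model's exact configuration.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Minimal bidirectional-LSTM POS tagger sketch.

    Sizes below are illustrative assumptions; a real tagger would use
    pre-trained embeddings and an actual tag inventory (e.g., the Penn
    Treebank tag set).
    """
    def __init__(self, vocab_size=30000, embed_dim=100, hidden_dim=128, num_tags=45):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.tag_scores = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> per-token tag scores (batch, seq_len, num_tags)
        embedded = self.embedding(token_ids)
        outputs, _ = self.lstm(embedded)
        return self.tag_scores(outputs)

# One training step on a random batch, just to show that the shapes line up.
model = BiLSTMTagger()
tokens = torch.randint(1, 30000, (8, 25))    # 8 dummy sentences of 25 tokens each
gold_tags = torch.randint(0, 45, (8, 25))    # dummy gold tags
loss = nn.CrossEntropyLoss()(model(tokens).reshape(-1, 45), gold_tags.reshape(-1))
loss.backward()
print(float(loss))
```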
4. Semantic Parsing and Question Answering

Question answering systems automatically answer different types of questions asked in natural language, including definition questions, biographical questions, multilingual questions, and so on. The use of neural networks makes it possible to develop high-performing question answering systems.

In Semantic Parsing via Staged Query Graph Generation: Question Answering with Knowledge Base, Wen-tau Yih, Ming-Wei Chang, Xiaodong He, and Jianfeng Gao described a semantic parsing framework for question answering over a knowledge base. The authors say their method uses the knowledge base at an early stage to prune the search space, thus simplifying the semantic matching problem [6]. It also applies an advanced entity linking system and a deep convolutional neural network model that matches questions against predicate sequences. The model was tested on the WebQuestions dataset, and it outperforms previous methods substantially.

5. Paraphrase Detection

Paraphrase detection determines whether two sentences have the same meaning. This task is especially important for question answering systems, since there are many ways to ask the same question.

Detecting Semantically Equivalent Questions in Online User Forums suggests a method for identifying semantically equivalent questions based on a convolutional neural network. The experiments were performed using data from the Ask Ubuntu Community Questions and Answers (Q&A) site and Meta Stack Exchange. It was shown that the proposed CNN model achieves high accuracy, especially when the word embeddings are pre-trained on in-domain data. The authors compared their model's performance with support vector machines and a duplicate-detection approach, and they demonstrated that their CNN model outperforms the baselines by a large margin [7].

In the study Paraphrase Detection Using Recursive Autoencoder, a novel recursive autoencoder architecture is presented. It learns phrasal representations using recursive neural networks. These representations are vectors in an n-dimensional semantic space where phrases with similar meanings are close to each other [8]. To evaluate the system, the Microsoft Research Paraphrase Corpus and the English Gigaword Corpus were used. The model was compared to three baselines, and it outperforms them all.

6. Language Generation and Multi-document Summarization

Natural language generation has many applications, such as automated writing of reports, generating texts based on analysis of retail sales data, summarizing electronic medical records, producing textual weather forecasts from weather data, and even producing jokes.

In a recent paper, Natural Language Generation, Paraphrasing and Summarization of User Reviews with Recurrent Neural Networks, researchers describe a recurrent neural network (RNN) model capable of generating novel sentences and document summaries. The paper described and evaluated a database of 820,000 consumer reviews in the Russian language. The design of the network gives users control over the meaning of the generated sentences. By choosing a sentence-level feature vector, it is possible to instruct the network, for example: "Say something good about a screen and sound quality in about ten words" [9]. This language generation ability allows the production of abstractive summaries of multiple user reviews, which often have reasonable quality. Usually, such a summary report makes it possible for users to quickly obtain the information contained in a large cluster of documents.
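As a much simplified illustration of RNN-based generation (not the feature-controlled model from the paper), the sketch below shows a character-level LSTM language model and a sampling loop that feeds each predicted character back in as the next input; trained on a review corpus, such a loop produces novel text character by character. The character inventory, layer sizes, and temperature are assumed values, and the model here is untrained, so the sampled output is noise.

```python
import torch
import torch.nn as nn

class CharLSTM(nn.Module):
    """Character-level LSTM language model (toy illustration, sizes assumed)."""
    def __init__(self, num_chars=100, embed_dim=64, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(num_chars, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.next_char = nn.Linear(hidden_dim, num_chars)

    def forward(self, char_ids, state=None):
        # char_ids: (batch, seq_len) -> logits over the next character at each position
        output, state = self.lstm(self.embedding(char_ids), state)
        return self.next_char(output), state

def sample(model, start_id, length=80, temperature=1.0):
    """Generate a sequence by repeatedly feeding the sampled character back in."""
    model.eval()
    chars, state = [start_id], None
    current = torch.tensor([[start_id]])
    with torch.no_grad():
        for _ in range(length):
            logits, state = model(current, state)
            probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
            next_id = torch.multinomial(probs, 1).item()
            chars.append(next_id)
            current = torch.tensor([[next_id]])
    return chars

# Untrained model, so this prints random character ids rather than real text.
print(sample(CharLSTM(), start_id=1, length=20))
```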
7. Machine Translation

Machine translation software is used around the world despite its limitations. In some domains, the quality of translation is not good. To improve the results, researchers try different techniques and models, including the neural network approach. The purpose of the Neural-based Machine Translation for Medical Text Domain study is to inspect the effects of different training methods on a Polish-English machine translation system used for medical data. The European Medicines Agency parallel text corpus was used to train the neural and statistical translation systems. It was demonstrated that a neural network requires fewer resources for training and maintenance. In addition, the neural network often substituted words with other words occurring in a similar context [10].

8. Speech Recognition

Speech recognition has many applications, such as home automation, mobile telephony, virtual assistance, hands-free computing, video games, and so on. Neural networks are widely used in this area. In Convolutional Neural Networks for Speech Recognition, scientists explain how to apply CNNs to speech recognition in a novel way, such that the CNN's structure directly accommodates some types of speech variability, like varying speaking rate [11]. TIMIT phone recognition and a large-vocabulary voice search task were used for evaluation.

9. Character Recognition

Character recognition systems also have numerous applications, like receipt character recognition, invoice character recognition, check character recognition, legal billing document character recognition, and so on. The article Character Recognition Using Neural Network presents a method for the recognition of handwritten characters with 85% accuracy [12].

10. Spell Checking

Most text editors let users check whether their text contains spelling mistakes. Neural networks are now incorporated into many spell-checking tools. In Personalized Spell Checking using Neural Networks, a new system for detecting misspelled words was proposed. This system is trained on observations of the specific corrections that a typist makes [13]. It overcomes many of the shortcomings of traditional spell-checking methods.

Summary

In this article, we described Natural Language Processing problems that can be solved using neural networks. As we showed, neural networks have many applications such as text classification, information extraction, semantic parsing, question answering, paraphrase detection, language generation, multi-document summarization, machine translation, and speech and character recognition. In many cases, neural network methods outperform other methods.

Resources

1. http://www.aclweb.org/anthology/D14-1181
2. https://arxiv.org/pdf/1502.01710.pdf
3. https://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/view/9745/9552
4. http://www.aclweb.org/anthology/N16-1030
5. https://arxiv.org/pdf/1510.06168.pdf
6. http://www.aclweb.org/anthology/P15-1128
7. https://www.aclweb.org/anthology/K15-1013
8. https://nlp.stanford.edu/courses/cs224n/2011/reports/ehhuang.pdf
9. http://www.meanotek.ru/files/TarasovDS(2)2015-Dialogue.pdf
10. http://www.sciencedirect.com/science/article/pii/S1877050915025910
11. https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/CNN_ASLPTrans2-14.pdf
12. http://www.ijettjournal.org/volume-4/issue-4/IJETT-V4I4P230.pdf
13. http://www.cs.umb.edu/~marc/pubs/garaas_xiao_pomplun_HCII2007.pdf