Recommended | Professor Jason Eisner of the Johns Hopkins University Department of Computer Science has compiled a list of excellent teaching resources for natural language processing

2017-05-29 全球人工智能

Below is a set of excellent natural language processing teaching resources compiled by Professor Jason Eisner of the Johns Hopkins University Department of Computer Science. We hope you find them useful:


  • The difference between natural language processing (NLP) and computational linguistics (CL)

    http://www.quora.com/What-is-the-difference-between-natural-language-processing-and-computational-linguistics/answer/Jason-Eisner?share=1


  • The difference between AI, ML, and NLP

    http://www.quora.com/Whats-the-difference-between-Machine-Learning-AI-and-NLP/answer/Jason-Eisner?share=1


  • The difference between frequentists and Bayesians: A dialogue

    http://www.quora.com/For-a-non-expert-what-is-the-difference-between-Bayesian-and-frequentist-approaches/answer/Jason-Eisner?share=1


  • The three cultures of machine learning

    https://www.cs.jhu.edu/~jason/tutorials/ml-simplex.html


  • Probability crash course (video+) — how to build simple probabilistic models

    http://videolectures.net/hltss2010_eisner_plm/video/1/
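
As a concrete companion to this item, the following is a minimal sketch (a toy of my own, not taken from the lecture) of about the simplest probabilistic model one can build: a bigram language model with add-one smoothing. The corpus and all names are invented for illustration.

```python
# A minimal sketch (not from the lecture): a bigram language model with
# add-one (Laplace) smoothing, estimated from a made-up three-sentence corpus.
from collections import Counter

corpus = ["the cat sat", "the cat ate", "a dog sat"]            # invented data
tokens = [["<s>"] + s.split() + ["</s>"] for s in corpus]

vocab   = {w for sent in tokens for w in sent}
context = Counter(u for sent in tokens for u in sent[:-1])      # bigram contexts
bigrams = Counter((u, v) for sent in tokens for u, v in zip(sent, sent[1:]))

def prob(v, u, V=len(vocab)):
    """Add-one smoothed estimate of P(v | u)."""
    return (bigrams[(u, v)] + 1) / (context[u] + V)

print(prob("cat", "the"))   # seen twice, so relatively high
print(prob("dog", "the"))   # unseen, but still gets smoothed probability mass
```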


  • Interactive lessons in log-linear modeling — nifty visualization toy with sliders

    • Latent-variable log-linear modeling

      http://www.quora.com/What-is-the-latent-log-linear-model-with-latent-variables-and-how-do-you-train-such-a-model/answer/Jason-Eisner?share=1
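
To make the setup in that answer concrete, here is a minimal numpy sketch, with invented weights and feature vectors for a single fixed input x, of a conditional log-linear model whose latent variable z is summed out to obtain p(y | x).

```python
# A minimal sketch (invented numbers, one fixed input x): a log-linear model
# p(y, z | x) proportional to exp(theta . f(x, y, z)), with z marginalized out.
import numpy as np

rng = np.random.default_rng(0)
num_y, num_z, num_feats = 3, 2, 5
theta = rng.normal(size=num_feats)                    # model weights
feats = rng.normal(size=(num_y, num_z, num_feats))    # feature vectors f(x, y, z)

scores = feats @ theta                   # theta . f(x, y, z), shape (num_y, num_z)
joint = np.exp(scores - scores.max())    # unnormalized, numerically stabilized
joint /= joint.sum()                     # p(y, z | x)
p_y = joint.sum(axis=1)                  # sum out the latent variable z
print(p_y, p_y.sum())                    # a proper distribution over y
```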


  • Interactive visualization of kernel SVMs (by Guillaume Caron; I'm only hosting it)

    https://www.cs.jhu.edu/~jason/tutorials/SVMApplet/
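
To connect the applet to a formula: a trained kernel SVM classifies by the sign of f(x) = sum_i alpha_i y_i K(x_i, x) + b. The sketch below evaluates that function with an RBF kernel; the support vectors, dual weights and bias are invented rather than learned.

```python
# A minimal sketch: evaluating a kernel-SVM decision function with an RBF
# kernel.  Support vectors, dual weights and bias are invented, not learned.
import numpy as np

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * np.sum((a - b) ** 2))

support = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])   # support vectors x_i
labels  = np.array([+1, +1, -1])                            # their labels y_i
alphas  = np.array([0.5, 0.5, 1.0])                         # dual weights alpha_i
bias    = 0.1

def decision(x):
    """f(x) = sum_i alpha_i * y_i * K(x_i, x) + b; its sign is the prediction."""
    return sum(a * y * rbf(sv, x) for a, y, sv in zip(alphas, labels, support)) + bias

print(decision(np.array([0.2, 0.8])))
```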



  • Lagrange multipliers — high-level explanation

    http://www.umiacs.umd.edu/~resnik/ling848_fa2004/lagrange.html
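
Alongside the high-level explanation, a standard worked example (not taken from the linked page) may help: maximize f(x, y) = xy subject to x + y = 1.

```latex
% A standard textbook example, not from the linked page:
% maximize f(x, y) = xy subject to g(x, y) = x + y - 1 = 0.
\[
  \mathcal{L}(x, y, \lambda) = xy - \lambda\,(x + y - 1)
\]
% Setting all partial derivatives to zero:
\[
  \frac{\partial \mathcal{L}}{\partial x} = y - \lambda = 0, \qquad
  \frac{\partial \mathcal{L}}{\partial y} = x - \lambda = 0, \qquad
  \frac{\partial \mathcal{L}}{\partial \lambda} = -(x + y - 1) = 0,
\]
% so x = y = \lambda = 1/2, and the constrained maximum is f(1/2, 1/2) = 1/4.
```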


  • Variational inference — high-level explanation

    https://www.cs.jhu.edu/~jason/tutorials/variational.html
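
The identity that most presentations of variational inference build on, stated here in its standard form rather than quoted from the tutorial, decomposes the log evidence into the ELBO plus a KL term:

```latex
% For any choice of q(z) over the latent variables z:
\[
  \log p(x) \;=\;
  \underbrace{\mathbb{E}_{q(z)}\!\left[ \log \frac{p(x, z)}{q(z)} \right]}_{\text{ELBO}(q)}
  \;+\;
  \underbrace{\mathrm{KL}\!\left( q(z) \,\middle\|\, p(z \mid x) \right)}_{\ge\, 0}
\]
% Maximizing the ELBO over a tractable family for q both tightens a lower
% bound on log p(x) and pushes q toward the true posterior p(z | x).
```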


  • Belief propagation (slides) — high-level explanation; ACL 2014/2015 tutorial

    https://www.cs.jhu.edu/~jason/tutorials/bp.ppt

    http://www.cs.jhu.edu/~mrg/bp-tutorial/
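
As a tiny executable companion to the slides (my own example, not one of theirs), here is sum-product belief propagation on a three-variable chain with invented potentials, checked against brute-force enumeration.

```python
# A minimal sketch (not from the tutorial slides): sum-product belief
# propagation on a 3-variable chain x1 - x2 - x3 with binary variables.
import numpy as np

unary = [np.array([1.0, 2.0]),    # psi_1(x1)
         np.array([1.0, 1.0]),    # psi_2(x2)
         np.array([3.0, 1.0])]    # psi_3(x3)
pair12 = np.array([[2.0, 1.0], [1.0, 2.0]])    # psi(x1, x2)
pair23 = np.array([[1.0, 3.0], [3.0, 1.0]])    # psi(x2, x3)

# Messages into x2 from both ends of the chain.
msg_1_to_2 = unary[0] @ pair12           # sum_x1 psi_1(x1) psi(x1, x2)
msg_3_to_2 = pair23 @ unary[2]           # sum_x3 psi(x2, x3) psi_3(x3)
belief2 = unary[1] * msg_1_to_2 * msg_3_to_2
belief2 /= belief2.sum()                 # marginal p(x2)

# Brute-force check over all 8 joint configurations.
brute = np.zeros(2)
for x1 in range(2):
    for x2 in range(2):
        for x3 in range(2):
            brute[x2] += (unary[0][x1] * unary[1][x2] * unary[2][x3]
                          * pair12[x1, x2] * pair23[x2, x3])
brute /= brute.sum()
print(belief2, brute)                    # the two agree
```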


  • Hidden Markov Models (video+more) — a fun detailed example

    https://www.cs.jhu.edu/~jason/papers/#eisner-2002-tnlp
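
Separately from the spreadsheet example in the linked materials, here is a minimal sketch of the forward algorithm for a two-state HMM; all probabilities are invented.

```python
# A minimal sketch: the forward algorithm for a tiny 2-state HMM, computing
# the total probability of an observation sequence.  Numbers are invented.
import numpy as np

start = np.array([0.6, 0.4])                  # p(state_0)
trans = np.array([[0.7, 0.3],                 # p(state_t | state_{t-1})
                  [0.4, 0.6]])
emit  = np.array([[0.5, 0.4, 0.1],            # p(observation | state)
                  [0.1, 0.3, 0.6]])
obs = [0, 2, 1]                               # an observation sequence

alpha = start * emit[:, obs[0]]               # forward probabilities at t = 0
for o in obs[1:]:
    alpha = (alpha @ trans) * emit[:, o]      # recurse: sum over previous states
print(alpha.sum())                            # p(observation sequence)
```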


  • Back-propagation (animated slides+video) — high-level explanation; also see the suggested readings at the top of the slide deck

    https://www.cs.jhu.edu/~jason/tutorials/backprop-pool.pptx
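
A minimal sketch, not taken from the slides, of what back-propagation does: the chain rule applied layer by layer through a one-hidden-layer network with a squared-error loss. Shapes and initial values are arbitrary.

```python
# A minimal sketch: manual back-propagation through a one-hidden-layer
# network with a squared-error loss, for a single training example.
import numpy as np

rng = np.random.default_rng(0)
x, target = rng.normal(size=3), 1.0
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(1, 4))

# Forward pass.
h_pre = W1 @ x                            # hidden pre-activation
h = np.tanh(h_pre)                        # hidden activation
y = (W2 @ h)[0]                           # scalar output
loss = 0.5 * (y - target) ** 2

# Backward pass: apply the chain rule layer by layer.
dy = y - target                           # dL/dy
dW2 = dy * h[np.newaxis, :]               # dL/dW2
dh = dy * W2[0]                           # dL/dh
dh_pre = dh * (1 - np.tanh(h_pre) ** 2)   # back through the tanh
dW1 = np.outer(dh_pre, x)                 # dL/dW1
print(loss, dW1.shape, dW2.shape)
```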


  • Understanding the inside-outside and forward-backward algorithms — they're just backprop

    http://cs.jhu.edu/~jason/papers/#spnlp16
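
The observation behind the paper can be summarized by a standard exponential-family identity (paraphrased here, not quoted): the gradients of the log-partition function, which automatic differentiation of the forward/inside pass produces, are exactly the expected counts the backward/outside pass is usually derived to compute.

```latex
% For a globally normalized model p_\theta(y) \propto \exp(\theta \cdot f(y)),
% whose partition function Z(\theta) is computed by a forward/inside recursion:
\[
  \frac{\partial \log Z(\theta)}{\partial \theta_k}
  \;=\; \mathbb{E}_{y \sim p_\theta}\!\left[ f_k(y) \right],
\]
% so back-propagating through the computation of \log Z recovers the expected
% feature counts (posterior marginals) that the backward/outside pass computes.
```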



  • An annotated drawing of an LSTM unit (based on Graves 2012)

    https://www.cs.jhu.edu/~jason/tutorials/lstm.png

    https://www.cs.toronto.edu/~graves/preprint.pdf
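
For reference next to the drawing, these are what I believe to be the standard LSTM update equations in the peephole form used by Graves (2012); sigma is the logistic sigmoid and the circled dot is element-wise multiplication.

```latex
\[
\begin{aligned}
  i_t &= \sigma( W_{xi} x_t + W_{hi} h_{t-1} + W_{ci} c_{t-1} + b_i ) \\
  f_t &= \sigma( W_{xf} x_t + W_{hf} h_{t-1} + W_{cf} c_{t-1} + b_f ) \\
  c_t &= f_t \odot c_{t-1} + i_t \odot \tanh( W_{xc} x_t + W_{hc} h_{t-1} + b_c ) \\
  o_t &= \sigma( W_{xo} x_t + W_{ho} h_{t-1} + W_{co} c_t + b_o ) \\
  h_t &= o_t \odot \tanh(c_t)
\end{aligned}
\]
```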


  • Bayesian generative modeling (video+) — works up to topic models and Bayesian HMMs

    http://techtalks.tv/events/76/?page=2
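
As a minimal sketch of the kind of Bayesian generative model such talks work up to, here is the generative story of a toy LDA-style topic model written as ancestral sampling; it is not taken from the talk, and all sizes and hyperparameters are invented.

```python
# A minimal sketch: ancestral sampling from a toy LDA-style topic model.
import numpy as np

rng = np.random.default_rng(0)
num_topics, vocab_size, doc_len = 3, 10, 8
alpha, beta = 0.5, 0.1                        # Dirichlet hyperparameters

topics = rng.dirichlet([beta] * vocab_size, size=num_topics)  # word dist per topic
theta = rng.dirichlet([alpha] * num_topics)                   # topic mix for one doc

doc = []
for _ in range(doc_len):
    z = rng.choice(num_topics, p=theta)           # draw a topic for this token
    w = rng.choice(vocab_size, p=topics[z])       # draw a word from that topic
    doc.append(w)
print(doc)                                        # word ids of the sampled document
```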


  • Minimum spanning tree (tutorial paper) — deep and clear coverage of how 7 algorithms were designed

    http://cs.jhu.edu/~jason/papers/#eisner-1997-mst
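
As a simple point of reference while reading the tutorial (not code derived from it), here is a minimal sketch of Kruskal's classical MST algorithm with a small union-find.

```python
# A minimal sketch: Kruskal's classical MST algorithm with union-find.
def mst_kruskal(n, edges):
    """edges: list of (weight, u, v) on vertices 0..n-1; returns tree edges."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]     # path halving
            x = parent[x]
        return x
    tree = []
    for w, u, v in sorted(edges):             # consider edges in weight order
        ru, rv = find(u), find(v)
        if ru != rv:                          # keep the edge iff it joins two components
            parent[ru] = rv
            tree.append((u, v, w))
    return tree

print(mst_kruskal(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)]))
```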


  • Convert a formula from SAT to CNF-SAT (pseudocode and discussion)

    https://www.cs.jhu.edu/~jason/tutorials/convert-to-CNF.html
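
As a minimal sketch of the simplest conversion strategy, distributing OR over AND for formulas already in negation normal form (worst-case exponential, and not the pseudocode from the linked page), here is a toy implementation:

```python
# A minimal sketch: naive CNF conversion by distributing OR over AND, for
# formulas already in negation normal form.  Formulas are nested tuples:
# ("var", name), ("not", ("var", name)), ("and", a, b), ("or", a, b).
def to_cnf(f):
    op = f[0]
    if op in ("var", "not"):
        return [[f]]                          # a single one-literal clause
    left, right = to_cnf(f[1]), to_cnf(f[2])
    if op == "and":
        return left + right                   # concatenate the clause lists
    # op == "or": distribute over the conjunctions on both sides
    return [lc + rc for lc in left for rc in right]

# (A and B) or C  ==>  (A or C) and (B or C)
formula = ("or", ("and", ("var", "A"), ("var", "B")), ("var", "C"))
for clause in to_cnf(formula):
    print(clause)
```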


  • Competitive grammar writing exercise, with software

    https://www.cs.jhu.edu/~jason/papers/#smith-eisner-2008-cgw
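
Finally, as a minimal sketch of the kind of object that exercise revolves around, here is a toy weighted CFG and a random sentence generator; the grammar is invented here and is not part of the course software.

```python
# A minimal sketch: randomly generating sentences from a tiny weighted CFG.
import random

grammar = {                                   # nonterminal -> list of (weight, expansion)
    "S":   [(1.0, ["NP", "VP"])],
    "NP":  [(0.6, ["Det", "N"]), (0.4, ["NP", "PP"])],
    "VP":  [(1.0, ["V", "NP"])],
    "PP":  [(1.0, ["P", "NP"])],
    "Det": [(1.0, ["the"])],
    "N":   [(0.5, ["cat"]), (0.5, ["president"])],
    "V":   [(1.0, ["admired"])],
    "P":   [(1.0, ["near"])],
}

def generate(symbol="S"):
    if symbol not in grammar:                 # terminal word
        return [symbol]
    weights, expansions = zip(*grammar[symbol])
    rhs = random.choices(expansions, weights=weights)[0]
    return [w for sym in rhs for w in generate(sym)]

random.seed(0)
print(" ".join(generate()))
```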
