Lecture1: Natural Language Processing with Deep Learning CS224N/Ling284 [Chris Manning]

ZGtheGreat · 2022-02-21

Contents

Terminology

  • word vector
  • Word2vec ("word to vector")
  • objective function gradients
  • optimization basics
  • effective modern methods for deep learning
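
For reference (this formula is not in the original notes), the "objective function gradients" item refers to the Word2vec skip-gram objective covered in this lecture: the average negative log-likelihood of the context words within a window of size $m$ around each center word,

$$
J(\theta) = -\frac{1}{T}\sum_{t=1}^{T}\sum_{\substack{-m \le j \le m \\ j \ne 0}} \log P(w_{t+j} \mid w_t;\ \theta),
\qquad
P(o \mid c) = \frac{\exp(u_o^{\top} v_c)}{\sum_{w \in V} \exp(u_w^{\top} v_c)},
$$

where $v_c$ and $u_o$ are the "center" and "outside" vectors of a word; the gradients in the outline below are taken with respect to these vectors.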

**Key learning today:**
The (really surprising!) result is that word meaning can be represented rather well by a (high-dimensional) vector of real numbers.
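
To make this key result concrete, here is a minimal sketch (not from the lecture; the words, dimensions, and numbers are invented for illustration) of treating word meanings as real-valued vectors and comparing them with cosine similarity:

```python
# A minimal sketch: toy "word vectors" and cosine similarity.
# The vectors below are made up for illustration; real word vectors
# (e.g. learned by Word2vec) are trained from data and usually have
# hundreds of dimensions.
import numpy as np

vectors = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words with related meanings should end up with similar vectors.
print(cosine_similarity(vectors["king"], vectors["queen"]))  # high (~0.99)
print(cosine_similarity(vectors["king"], vectors["apple"]))  # much lower (~0.31)
```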

Lecture 1: Introduction and Word Vectors (80 mins in total)

The course (12.5%)

Course logistics in brief

  • Instructor: Christopher Manning (aka. Chris Manning)
  • Head TA: Anna Goldie
  • Coordinator: Amelie Byun
  • TAs: Many wonderful people! See website
  • Time: Tu/Th 3:15-4:45 Pacific time, Zoom U.(->video)
  • We’ve put a lot of other important information on the class webpage. Please read it!
    http://cs224n.stanford.edu/
    a.k.a., http://www.stanford.edu/class/cs224n/
  • TAs, syllabus, help sessions/office hours, Ed (for all course questions/discussion)
  • Office hours start Thursday evening!
  • Python/numpy and then PyTorch tutorials: First two Fridays 1:30-2:30 Pacific time on Zoom U.
  • Slide PDFs uploaded before each lecture

What do we hope to teach? (A.k.a. “learning goals”)

  1. The foundations of the effective modern methods for deep learning applied to NLP. Basics first, then the key methods used in NLP: word vectors, feed-forward networks, recurrent networks, attention, encoder-decoder models, transformers, etc.
  2. A big-picture understanding of human languages and of the difficulties in understanding and producing them via computers.
  3. An understanding of, and the ability to build, systems (in PyTorch) for some of the major problems in NLP: word meaning, dependency parsing, machine translation, question answering.

Course work and grading policy

  • 5x1-week Assignments: 6%+4x12%: 54%
    • HW1 is released today! Due next Tuesday! At 3:15 p.m.
    • Submitted to Gradescope in Canvas (i.e., using @stanford.edu email for your Gradescope account)
  • Final Default or Custom Course Project (1-3 people): 43%
    • Project proposal: 5%, milestone: 5%, poster or web summary: 3%, report: 30%
  • Participation: 3%
    • Guest lecture reactions, Ed, (???) course evals, karma - see website!
  • Late day policy
    • 6 free late days; afterwards, 1% off course grade per day late
    • assignments not accepted more than 3 days late per assignment unless given permission in advance
  • Collaboration policy: Please read the website and the Honor Code! Understand allowed collaboration and how to document it: Don’t take code off the web; acknowledge working with other students; write your own assignment solutions.

Human language and word meaning (18.75%)

Word2vec introduction (18.75%)

Word2vec objective function gradients (31.25%)

Optimization basics (6.25%)

Looking at word vectors (12.5%)
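
This last part of the lecture looks at learned word vectors in practice. A hedged sketch of how one might do the same kind of exploration with gensim is below; the model name "glove-wiki-gigaword-100" is an assumption, and any pretrained embedding exposed by gensim's downloader would behave the same way.

```python
# A minimal sketch (not from these notes) of exploring pretrained word vectors.
import gensim.downloader as api

# Downloads the vectors on first use (roughly 130 MB for this model).
wv = api.load("glove-wiki-gigaword-100")

# Nearest neighbours by cosine similarity: words that appear in similar contexts.
print(wv.most_similar("banana", topn=5))

# The classic analogy demo: king - man + woman ≈ queen.
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```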
