1 - 10 of 21 results for: CS229

CS 81SI: AI Interpretability and Fairness

As black-box AI models grow increasingly relevant in human-centric applications, explainability and fairness become increasingly necessary for trust in adopting AI models. This seminar class introduces students to major problems in AI explainability and fairness, and explores key state-of-the-art methods. Key technical topics include surrogate methods, feature visualization, network dissection, adversarial debiasing, and fairness metrics. There will be a survey of recent legal and policy trends. Each week a guest lecturer from AI research, industry, or related policy fields will present an open problem and solution, followed by a roundtable discussion with the class. Students have the opportunity to present a topic of interest or an application to their own projects (solo or in teams) in the final class. Code examples of each topic will be provided for students interested in a particular topic, but there will be no required coding components. Students who will benefit most from this class have exposure to AI, such as through projects and related coursework (e.g. statistics, CS221, CS230, CS229). Students pursuing subjects outside of the CS department (e.g. sciences, social sciences, humanities) with sufficient mathematical maturity are welcome to apply. Enrollment limited to 20.
Last offered: Spring 2020
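
A minimal sketch of one of the fairness metrics named above (demographic parity difference), assuming binary predictions and a binary protected attribute; it is an illustrative example, not material from the course.

# Minimal sketch of one fairness metric mentioned above: demographic parity
# difference between two groups. Illustrative only; not course material.
import numpy as np

def demographic_parity_difference(y_pred, group):
    """Absolute difference in positive-prediction rates between two groups.

    y_pred : array of 0/1 predictions
    group  : array of 0/1 protected-attribute labels
    """
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)
    rate_a = y_pred[group == 0].mean()
    rate_b = y_pred[group == 1].mean()
    return abs(rate_a - rate_b)

# Toy example: the classifier predicts the positive class for group 1 more often.
y_pred = np.array([1, 0, 0, 1, 1, 1, 1, 0])
group  = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(demographic_parity_difference(y_pred, group))  # 0.25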

CS 224N: Natural Language Processing with Deep Learning (LINGUIST 284, SYMSYS 195N)

Methods for processing human language information and the underlying computational properties of natural languages. Focus on deep learning approaches: understanding, implementing, training, debugging, visualizing, and extending neural network models for a variety of language understanding tasks. Exploration of natural language tasks ranging from simple word level and syntactic processing to coreference, question answering, and machine translation. Examination of representative papers and systems and completion of a final project applying a complex neural network model to a large-scale NLP problem. Prerequisites: calculus and linear algebra; CS124, CS221, or CS229.
Terms: Win, Spr | Units: 3-4

CS 224S: Spoken Language Processing (LINGUIST 285)

Introduction to spoken language technology with an emphasis on dialogue and conversational systems. Deep learning and other methods for automatic speech recognition, speech synthesis, affect detection, dialogue management, and applications to digital assistants and spoken language understanding systems. Prerequisites: CS124, CS221, CS224N, or CS229.
Terms: Spr | Units: 2-4
Instructors: Maas, A. (PI)

CS 229: Machine Learning (STATS 229)

Topics: statistical pattern recognition, linear and non-linear regression, non-parametric methods, exponential family, GLMs, support vector machines, kernel methods, deep learning, model/feature selection, learning theory, ML advice, clustering, density estimation, EM, dimensionality reduction, ICA, PCA, reinforcement learning and adaptive control, Markov decision processes, approximate dynamic programming, and policy search. Prerequisites: knowledge of basic computer science principles and skills, at a level sufficient to write a reasonably non-trivial computer program in Python/NumPy, to the equivalency of CS106A, CS106B, or CS106X; familiarity with probability theory to the equivalency of CS109, MATH151, or STATS 116; and familiarity with multivariable calculus and linear algebra to the equivalency of MATH51 or CS205.
Terms: Aut, Win | Units: 3-4
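
A minimal sketch of one listed topic (linear regression fit by batch gradient descent), at roughly the Python/NumPy level the prerequisites assume; the data and hyperparameters are made up for illustration and this is not course code.

# Minimal sketch: batch gradient descent for least-squares linear regression.
# Illustrative of the listed topics and the assumed NumPy fluency; not course code.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 100 examples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)   # noisy linear targets

w = np.zeros(3)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)         # gradient of (1/2) * mean squared error
    w -= lr * grad

print(w)  # should be close to [2.0, -1.0, 0.5]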

CS 229B: Machine Learning for Sequence Modeling (STATS 232)

Sequence data and time series are becoming increasingly ubiquitous in fields as diverse as bioinformatics, neuroscience, health, environmental monitoring, finance, speech recognition/generation, video processing, and natural language processing. Machine learning has become an indispensable tool for analyzing such data; in fact, sequence models lie at the heart of recent progress in AI, such as GPT-3. This class integrates foundational concepts in time series analysis with modern machine learning methods for sequence modeling. Connections and key differences will be highlighted, as well as how grounding modern neural network approaches in traditional interpretations can enable powerful leaps forward. You will learn theoretical fundamentals, but the focus will be on gaining practical, hands-on experience with modern methods through real-world case studies. You will walk away with a broad and deep perspective on sequence modeling and key ways in which such data are not just 1D images.
Terms: Aut | Units: 3-4
Instructors: Fox, E. (PI)
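
A minimal sketch of the classical side of sequence modeling described above: fitting an AR(2) autoregressive model by least squares. The simulated series and coefficients are illustrative assumptions, not course material.

# Minimal sketch: fit an AR(2) model to a synthetic time series by least squares.
# Illustrative of classical time-series analysis; not course code.
import numpy as np

rng = np.random.default_rng(0)
T = 500
x = np.zeros(T)
for t in range(2, T):                        # simulate x_t = 0.6 x_{t-1} - 0.2 x_{t-2} + noise
    x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.normal(scale=0.5)

# Build the lagged design matrix and solve the least-squares problem.
X = np.column_stack([x[1:-1], x[:-2]])       # lags 1 and 2
y = x[2:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)                                  # approximately [0.6, -0.2]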

CS 229M: Machine Learning Theory (STATS 214)

How do we use mathematical thinking to design better machine learning methods? This course focuses on developing mathematical tools for answering this question. It will cover fundamental concepts and principled algorithms in machine learning, particularly those related to modern large-scale non-linear models. Topics include concentration inequalities, generalization bounds via uniform convergence, non-convex optimization, the implicit regularization effect in deep learning, unsupervised learning, and domain adaptation. Prerequisites: linear algebra (MATH 51 or CS 205), probability theory (STATS 116, MATH 151, or CS 109), and machine learning (CS 229, STATS 229, or STATS 315A).
Terms: Aut | Units: 3
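
As a one-line taste of the concentration inequalities listed above, Hoeffding's inequality bounds how far the empirical mean of bounded i.i.d. variables can stray from its expectation (a standard statement, not a result specific to this course):

\[
\Pr\!\left(\left|\frac{1}{n}\sum_{i=1}^{n} X_i - \mathbb{E}[X_1]\right| \ge t\right) \le 2\exp\!\left(-\frac{2nt^2}{(b-a)^2}\right), \qquad X_i \in [a, b] \text{ i.i.d.}
\]

Bounds of this flavor are the building blocks of the uniform-convergence generalization bounds in the topic list.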

CS 229S: Systems for Machine Learning

Deep learning and neural networks are being increasingly adopted across industries. They are now used to serve billions of users across applications such as search, knowledge discovery, and productivity assistants. As models become more capable and intelligent, this trend of large-scale adoption will continue to grow rapidly. Due to this widespread adoption, there is an increasing need to achieve high performance for both training and serving deep-learning models. However, performance is hindered by a multitude of infrastructure and lifecycle hurdles: the increasing complexity of the models, the massive sizes of training and inference data, the heterogeneity of available accelerators and multi-node platforms, and diverse network properties. The slow adaptation of systems to new algorithms creates a bottleneck for the rapid evolution of deep-learning models and their applications. This course will cover systems approaches for improving the efficiency of machine learning pipelines (comprising data preparation, model training, and model deployment and inference) at each level of the systems stack, spanning software and hardware.
Terms: Aut | Units: 3
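
A small, hypothetical illustration of one systems-level efficiency technique on the inference side of the pipeline (post-training weight quantization to int8); the layer size and quantization scheme are assumptions, not drawn from the course.

# Minimal sketch: symmetric post-training quantization of a weight matrix to int8,
# one of many systems-level efficiency techniques; hypothetical, not course code.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(scale=0.05, size=(256, 256)).astype(np.float32)  # toy layer weights

scale = np.abs(w).max() / 127.0                                 # one scale for the whole tensor
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_dequant = w_int8.astype(np.float32) * scale

print("memory: %d -> %d bytes" % (w.nbytes, w_int8.nbytes))     # 4x smaller
print("max abs error:", np.abs(w - w_dequant).max())            # small relative to |w|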

CS 236G: Generative Adversarial Networks

Generative Adversarial Networks (GANs) have rapidly emerged as the state-of-the-art technique in realistic image generation. This course presents theoretical intuition and practical knowledge on GANs, from their simplest to their state-of-the-art forms. Their applications span realistic image editing (now omnipresent in popular app filters), tumor classification in medicine under low-data regimes, and visualization of realistic climate change scenarios. This course also examines key challenges of GANs today, including reliable evaluation, inherent biases, and training stability. After this course, students should be familiar with GANs and the broader generative modeling and machine learning contexts in which these models are situated. Prerequisites: linear algebra, statistics, CS106B, plus a graduate-level AI course such as CS230, CS229 (or CS129), or CS221.
Last offered: Winter 2022
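
A minimal sketch of the standard GAN setup in its simplest form, matching a 1-D toy distribution; the architecture, losses, and hyperparameters are illustrative assumptions written with PyTorch, not the course's implementation.

# Minimal sketch of the standard GAN objective on a 1-D toy distribution.
# Illustrative only; hyperparameters and architecture are made up, not course code.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator maps noise to samples; discriminator outputs a real/fake logit.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0      # target distribution: N(3, 0.5^2)
    noise = torch.randn(64, 8)
    fake = G(noise)

    # Discriminator step: push real toward label 1, fake toward label 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step (non-saturating loss): make D label fakes as real.
    g_loss = bce(D(G(noise)), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())   # should drift toward 3.0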

CS 281: Ethics of Artificial Intelligence

Machine learning has become an indispensable tool for creating intelligent applications, accelerating scientific discoveries, and making better data-driven decisions. Yet, the automation and scaling of such tasks can have troubling negative societal impacts. Through practical case studies, you will identify issues of fairness, justice, and truth in AI applications. You will then apply recent techniques to detect and mitigate such algorithmic biases, along with methods to provide more transparency and explainability for state-of-the-art ML models. Finally, you will derive fundamental formal results on the limits of such techniques, along with the tradeoffs that must be made for their practical application. Prerequisites: CS229 or equivalent classes or experience.
Terms: Spr | Units: 3-4
Instructors: Guestrin, C. (PI)

CS 329D: Machine Learning Under Distributional Shifts

The progress of machine learning systems has seemed remarkable and inexorable: a wide array of benchmark tasks, including image classification, speech recognition, and question answering, have seen consistent and substantial accuracy gains year on year. However, these same models are known to fail consistently on atypical examples and domains not contained within the training data. The goal of the course is to introduce the variety of areas in which distributional shifts appear, as well as to provide theoretical characterizations and learning bounds for distribution shifts. Prerequisites: CS229 or equivalent. Recommended: CS229T (or basic knowledge of learning theory).
Last offered: Spring 2023
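
A minimal, hypothetical sketch of one classical response to covariate shift (importance weighting, with the density ratio estimated by a domain classifier); it uses NumPy and scikit-learn and is not taken from the course.

# Minimal sketch: importance weighting for covariate shift. A domain classifier
# estimates p_test(x)/p_train(x); the ratios reweight training examples.
# Hypothetical illustration; not course code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
x_train = rng.normal(loc=0.0, scale=1.0, size=(500, 1))   # training inputs
x_test = rng.normal(loc=1.0, scale=1.0, size=(500, 1))    # shifted test inputs
y_train = (x_train[:, 0] + 0.3 * rng.normal(size=500) > 0.5).astype(int)

# Domain classifier: label 0 = train, 1 = test; its odds estimate the density ratio.
domain_x = np.vstack([x_train, x_test])
domain_y = np.concatenate([np.zeros(500), np.ones(500)])
dom = LogisticRegression().fit(domain_x, domain_y)
p_test = dom.predict_proba(x_train)[:, 1]
weights = p_test / (1.0 - p_test)                          # ~ p_test(x) / p_train(x)

# Train the task model with importance weights so it focuses on test-like inputs.
clf = LogisticRegression().fit(x_train, y_train, sample_weight=weights)
print(clf.coef_, clf.intercept_)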