Three principles for data science: predictability, stability, and computability

Author: Berkeley Institute for Data Science (BIDS)

Uploaded: 2017-09-12

Views: 1812

Description:

Speaker: Bin Yu, Chancellor’s Professor of Statistics at the University of California at Berkeley

Berkeley Distinguished Lectures in Data Science, Fall 2017
https://bids.berkeley.edu/news/berkel...

Title: Three principles for data science: predictability, stability, and computability

Date: September 12, 2017
Time: 4:10pm to 5:00pm
Locations: BIDS, 190 Doe Library, UC Berkeley

ABSTRACT

In this talk, I'd like to discuss the intertwined importance of, and connections among, the three principles of data science named in the title as they bear on data-driven decisions. Taking prediction as its central task and embracing computation as its core, machine learning has enabled wide-ranging data-driven successes. Prediction is a useful way to check against reality. Good prediction implicitly assumes stability between past and future. Stability (relative to data and model perturbations) is also a minimum requirement for the interpretability and reproducibility of data-driven results (cf. Yu, 2013), and it is closely related to uncertainty assessment. Obviously, neither the prediction nor the stability principle can be employed without feasible computational algorithms, hence the importance of computability.
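The stability principle above, that conclusions should not change much under reasonable perturbations of the data, can be sketched with a bootstrap perturbation check. This is a minimal illustration under invented assumptions (synthetic data, a simple linear model), not the speaker's actual methodology:

```python
import random
import statistics

def fit_slope(xs, ys):
    """Ordinary least-squares slope for simple linear regression."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def stability_check(xs, ys, n_boot=200, seed=0):
    """Refit on bootstrap resamples of the data; a stable estimate
    has a small spread across the perturbed fits."""
    rng = random.Random(seed)
    n = len(xs)
    slopes = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        slopes.append(fit_slope([xs[i] for i in idx],
                                [ys[i] for i in idx]))
    return statistics.fmean(slopes), statistics.stdev(slopes)

# Synthetic data: y = 2x + Gaussian noise.
rng = random.Random(1)
xs = [i / 10 for i in range(100)]
ys = [2 * x + rng.gauss(0, 0.5) for x in xs]
mean_slope, sd_slope = stability_check(xs, ys)
```

A small `sd_slope` relative to `mean_slope` indicates the fitted conclusion survives data perturbation; a large one flags an unstable, hard-to-interpret result.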

The three principles will be demonstrated in the context of two collaborative neuroscience projects with the Gallant Lab and through analytical connections. In particular, the first project adds stability to the predictive modeling used to reconstruct movies from fMRI brain signals, in order to gain interpretability of the predictive model. The second project uses predictive transfer learning that combines AlexNet, GoogLeNet, and VGG with single V4 neuron data to achieve state-of-the-art prediction performance. Moreover, it provides a stable functional characterization of neurons via (manifold) deep-dream images derived from the predictive models in the challenging primate visual cortex area V4. Our V4 results lend support, to a certain extent, to the resemblance of these CNNs to a primate brain.
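The second project's recipe, keeping a pretrained network frozen and fitting only a regularized linear read-out to neural responses, can be sketched in miniature. Everything below is an assumption made for illustration: a random-projection-plus-ReLU map stands in for the AlexNet/GoogLeNet/VGG features, and the "neuron" is simulated as a noisy linear function of those features:

```python
import math
import random

rng = random.Random(0)

# Frozen "pretrained" feature map standing in for CNN activations
# (random projection + ReLU; a toy stand-in, not real network layers).
D_IN, D_FEAT = 8, 16
W_frozen = [[rng.gauss(0.0, 1.0 / math.sqrt(D_IN)) for _ in range(D_IN)]
            for _ in range(D_FEAT)]

def features(x):
    """Fixed (non-trainable) feature extractor: ReLU(W x)."""
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in W_frozen]

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: solve (X'X + lam*I) w = X'y."""
    n, d = len(X), len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) + (lam if j == k else 0.0)
          for k in range(d)] for j in range(d)]
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(d)]
    # Gaussian elimination with partial pivoting, then back-substitution.
    for col in range(d):
        piv = max(range(col, d), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, d):
            f = A[r][col] / A[col][col]
            for c in range(col, d):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * d
    for r in range(d - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, d))) / A[r][r]
    return w

# Simulated "neuron responses": a noisy linear function of the frozen
# features. Only the linear read-out is fitted -- the transfer step.
truth = [rng.gauss(0.0, 1.0) for _ in range(D_FEAT)]
stimuli = [[rng.gauss(0.0, 1.0) for _ in range(D_IN)] for _ in range(300)]
X = [features(s) for s in stimuli]
y = [sum(t * f for t, f in zip(truth, row)) + rng.gauss(0.0, 0.1) for row in X]

w = ridge_fit(X, y, lam=0.1)
preds = [sum(wi * f for wi, f in zip(w, row)) for row in X]
```

The design point is that only the small read-out `w` is fitted to the scarce neural data, while the expensive representation stays frozen; regularization (the ridge penalty) keeps the read-out stable when features are correlated.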



SPEAKER

Bin Yu is Chancellor’s Professor in the Departments of Statistics and of Electrical Engineering & Computer Science at the University of California at Berkeley and a former Chair of Statistics at Berkeley. She is founding co-director of the Microsoft Joint Lab at Peking University on Statistics and Information Technology. Her group at Berkeley is engaged in interdisciplinary research with scientists from genomics, neuroscience, and medicine. In order to solve data problems in these domain areas, her group employs quantitative critical thinking and develops statistical and machine learning algorithms and theory. She has published more than 100 scientific papers in premier journals in statistics, machine learning, information theory, signal processing, remote sensing, neuroscience, genomics, and networks.

She is a member of the U.S. National Academy of Sciences and a fellow of the American Academy of Arts and Sciences. She was a Guggenheim Fellow in 2006, an invited speaker at ICIAM in 2011, the Tukey Memorial Lecturer of the Bernoulli Society in 2012, and the Rietz Lecturer of the Institute of Mathematical Statistics (IMS) in 2016. She was IMS president in 2013–2014, and she is a fellow of IMS, ASA, AAAS, and IEEE. She has served or is serving on leadership committees of NAS-BMSA, SAMSI, IPAM, and ICERM and on editorial boards of the Journal of Machine Learning Research, the Annals of Statistics, and the Annual Review of Statistics.


BERKELEY DISTINGUISHED LECTURES IN DATA SCIENCE
https://bids.berkeley.edu/news/berkel...

The Berkeley Distinguished Lectures in Data Science, co-hosted by the Berkeley Institute for Data Science (BIDS) and the Berkeley Division of Data Sciences, feature faculty doing visionary research that illustrates the character of the ongoing revolution in data, computation, and inference. In this inaugural Fall 2017 "local edition," we bring forward Berkeley faculty working in these areas as part of enriching the active connections among colleagues campus-wide. All campus community members are welcome and encouraged to attend. Arrive at 3:30pm for tea, coffee, and discussion.
