Machine Learning and Data Analysis for Nuclear Physics, a Nuclear TALENT Course at the ECT*, Trento, Italy, June 22 to July 3, 2020.

Why a course on Machine Learning for Nuclear Physics?

Probability theory and statistical methods play a central role in science. Nowadays we are surrounded by huge amounts of data. For example, there are about one trillion web pages; more than one hour of video is uploaded to YouTube every second, amounting to 10 years of content every day; the genomes of thousands of people, each of which has a length of more than a billion base pairs, have been sequenced by various labs; and so on. This deluge of data calls for automated methods of data analysis, which is exactly what machine learning provides.

The purpose of this Nuclear TALENT course is to provide an introduction to the core concepts and tools of machine learning in a manner that is accessible and intuitive to physicists, and to nuclear physicists in particular. We start with basic methods from supervised learning and statistical data analysis, such as various regression methods, before moving on to deep learning methods for both supervised and unsupervised learning, with an emphasis on the analysis of nuclear physics experiments and theoretical nuclear physics. The students will work on hands-on daily examples as well as projects that can result in final credits. Exercises and projects will be provided, and the aim is to give the participants an overview of how machine learning can be used to analyze and study nuclear physics problems, in both experiment and theory. The major scope is to give the participants a deeper understanding of what machine learning and data analysis are and how they can be used to analyze data from nuclear physics experiments and to perform theoretical calculations of nuclear many-body systems.

The goals of the Nuclear TALENT course on Machine Learning and Data Analysis are to give the participants a deeper understanding and critical view of several widely used machine learning algorithms, covering both supervised and unsupervised learning. The learning outcomes involve an understanding of the following central methods:

  • Basic concepts of machine learning and data analysis, and statistical concepts like expectation values, variance, covariance, correlation functions and errors;
  • Estimation of errors using cross-validation, blocking, bootstrapping and jackknife methods;
  • Optimization of functions;
  • Linear regression and logistic regression (see the short sketch after this list);
  • Dimensionality reduction, from PCA to clustering;
  • Boltzmann machines;
  • Neural networks and deep learning;
  • Convolutional neural networks;
  • Recurrent neural networks and autoencoders;
  • Decision trees and random forests;
  • Support vector machines and kernel transformations.
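
To give a flavour of the hands-on sessions, the short sketch below illustrates two of the items above: ordinary linear regression and error estimation with cross-validation and the bootstrap, using Scikit-Learn. It is a minimal illustration on made-up polynomial toy data, not official course material, and all variable names are our own.

    # Minimal sketch (toy data, not course material): linear regression with
    # k-fold cross-validation and a bootstrap estimate of the coefficient spread.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.utils import resample

    rng = np.random.default_rng(seed=42)
    x = rng.uniform(0.0, 1.0, size=(200, 1))
    y = 2.0 + 3.0 * x[:, 0] + 0.5 * x[:, 0] ** 2 + 0.1 * rng.normal(size=200)

    # Design matrix with polynomial features [1, x, x^2]
    X = np.hstack([np.ones_like(x), x, x ** 2])
    model = LinearRegression(fit_intercept=False)

    # 5-fold cross-validation of the mean squared error
    mse = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    print(f"cross-validated MSE: {mse.mean():.4f} +/- {mse.std():.4f}")

    # Bootstrap: refit on resampled data to estimate the spread of the coefficients
    coefs = np.array([LinearRegression(fit_intercept=False).fit(*resample(X, y)).coef_
                      for _ in range(500)])
    print("bootstrap standard deviation of the coefficients:", coefs.std(axis=0))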

We are targeting an audience of graduate students (both Master of Science and PhD) as well as post-doctoral researchers in nuclear experiment and theory.

The teaching team consists of both theorists and experimentalists. We believe such a mix is important, as it gives the participants a better understanding of how data are obtained, and of the limitations and possibilities in understanding and interpreting the experimental information.

Introduction to the TALENT Courses

A recently established initiative, Training in Advanced Low Energy Nuclear Theory (TALENT), aims at providing advanced and comprehensive training to graduate students and young researchers in low-energy nuclear theory. The initiative is a multinational network between several European and North American institutions and aims at developing a broad curriculum that will provide the platform for cutting-edge theory of nuclei and nuclear reactions. These objectives will be met by offering series of lectures, commissioned from experienced teachers in nuclear theory. The educational material generated under this program will be collected in the form of web-based courses, textbooks and a variety of modern educational resources. No such all-encompassing material is available at present; its development will allow dispersed university groups to profit from the best expertise available.

Aims and Learning Outcomes

This two-week online TALENT course on nuclear theory will focus on Machine Learning and Data Analysis algorithms for nuclear physics and on the use of such methods in the interpretation of data on the structure of nuclear systems.

We propose approximately twenty hours of lectures over two weeks and a comparable amount of practical computer and exercise sessions, including the setting of individual problems and the organization of various individual projects.

The mornings will consist of lectures and the afternoons will be devoted to exercises meant to shed light on the theory presented in the lectures, to the computational projects and to individual student projects. These components will be coordinated to foster student engagement, maximize learning and create lasting value for the students. For the benefit of the TALENT series and of the community, material (courses, slides, problems and solutions, reports on students' projects) will be made publicly available using version control software like git and posted on GitHub (this site).

Learning Outcomes

At the end of the course the students should have acquired:

  • A basic understanding of statistical data analysis, and of the theory and tools needed to handle large data sets;
  • A solid understanding of central machine learning algorithms for supervised and unsupervised learning, including linear and logistic regression, support vector machines, decision trees and random forests, neural networks and deep learning (convolutional neural networks, recurrent neural networks, etc.);
  • The ability to write code for linear regression and logistic regression and to use modern libraries like TensorFlow, PyTorch and Scikit-Learn to analyze data from nuclear physics experiments and perform theoretical calculations (a minimal sketch follows this list);
  • A deeper understanding of the statistical properties of the various methods, from the bias-variance tradeoff to resampling techniques.
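
As a hedged illustration of the second and third outcomes, the sketch below contrasts a logistic-regression baseline in Scikit-Learn with a small dense neural network in TensorFlow/Keras on synthetic "signal versus background" labels. The data and all names are assumptions made for this illustration; nothing here represents actual experimental data or the course's own exercises.

    # Minimal sketch (synthetic labels, not nuclear physics data): a logistic
    # regression baseline and a small dense neural network classifier.
    import numpy as np
    import tensorflow as tf
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(seed=1)
    X = rng.normal(size=(1000, 2))                         # two toy features per "event"
    y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.5).astype(int)   # made-up signal/background label

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    # Baseline: logistic regression
    logreg = LogisticRegression().fit(X_train, y_train)
    print("logistic regression accuracy:", logreg.score(X_test, y_test))

    # Small dense neural network with Keras
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(2,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X_train, y_train, epochs=20, batch_size=32, verbose=0)
    print("neural network accuracy:", model.evaluate(X_test, y_test, verbose=0)[1])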

Course Content and detailed plan of the online TALENT course

The lecture plan is as follows:

Week 1

Week 2

  • Monday: Neural networks and deep learning (Michelle Kuchera and Raghu Ramanujan, MK and RR)
  • Tuesday: From neural networks to convolutional neural networks and how to analyze experiments (classification of events and real data) (MK and RR)
  • Wednesday: Discussion of nuclear experiments and how to analyze data, with a presentation of simulated data from the Active-Target Time-Projection Chamber (AT-TPC) (Daniel Bazin)
  • Thursday: Generative models (MK and RR)
  • Friday: Reinforcement learning (MK and RR)
  • Friday: Future directions in machine learning and summary of the course

Teaching

The course will be taught as an intensive online course over two weeks, with a total of 20 hours of lectures and 10 hours of exercises, questions and answers. Videos and digital learning material will be made available one week before the course begins. It is possible to work on a final assignment corresponding to two weeks of work. The total load will be approximately 80 hours, corresponding to 5 ECTS in Europe. The final assignment will be graded with the marks A, B, C, D, E and failed for Master students, and passed/not passed for PhD students. A course certificate from the University of Trento will be issued to students who require it.

The registration link is at https://www.ectstar.eu/node/4472

The organization of a typical course day is as follows:

Time and activity:

  • 2pm-4pm (Central European Time, CET): Lectures, project-relevant information and directed exercises
  • 5pm-6pm (CET): Questions and answers, computational projects, exercises and hands-on sessions

Teachers and organizers

The teachers and organizers are

  • Daniel Bazin at Department of Physics and Astronomy and National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing, Michigan, USA (DB)
  • Morten Hjorth-Jensen at Department of Physics and Astronomy and National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing, Michigan, USA (MHJ)
  • Michelle Kuchera at Physics Department, Davidson College, Davidson, North Carolina, USA (MK)
  • Sean Liddick at Department of Chemistry and National Superconducting Cyclotron Laboratory, Michigan State University, East Lansing, Michigan, USA (SL)
  • Raghuram Ramanujan at Department of Mathematics and Computer Science, Davidson College, Davidson, North Carolina, USA (RR)

Morten Hjorth-Jensen will also function as student advisor and coordinator.

Audience and Prerequisites

The course targets students and post-doctoral fellows in nuclear physics, experiment and theory alike, who are interested in data analysis and machine learning applied to nuclear physics.

The students are expected to have working programming skills, in particular in interpreted languages like Python. Preparatory modules on Python programming will be provided, as will a review of linear algebra and statistical data analysis.

Students who have not studied the above topics are expected to gain this knowledge prior to attendance. Additional modules for self-study will be provided well before the course begins.

Online Material:

For more information, please go to https://github.com/NuclearTalent/MachineLearningECT, or go to http://nucleartalent.github.io/MachineLearningECT/doc/web/course.html for a better display of the course material and the topics to be covered.

Admission

The target group is Master of Science students, PhD students and early post-doctoral fellows. Senior staff can also attend, but they have to be self-supported.

Possible textbooks

Recommended textbook:

  • Aurélien Géron, Hands-On Machine Learning with Scikit-Learn and TensorFlow, O'Reilly

General books on statistical analysis:

  • Christian Robert and George Casella, Monte Carlo Statistical Methods, Springer
  • Peter Hoff, A First Course in Bayesian Statistical Methods, Springer
  • Trevor Hastie, Robert Tibshirani and Jerome H. Friedman, The Elements of Statistical Learning, Springer

General machine learning books:

  • Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press
  • Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer
  • David J.C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press
  • David Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press
