The purpose of the Competence Center on Simulation and Big Data is to foster effective collaboration between the different LIP groups working in these areas and to boost the capability to exploit the existing expertise both internally and externally, towards academia and industry. The different LIP groups have a wide range of competences in data analysis and simulation tools, including physics models, Monte Carlo generators, detector simulation tools, big-data handling techniques and data mining. Fully benefiting from these competences requires critical mass, a coordinated training program, the exploitation of synergies between groups and a clear identification of the key areas where we can contribute competitively.
The competence center started its activities in 2017. Its first priorities were to identify the technical competences mastered by LIP members in these two areas, to establish communication and discussion forums, to start a training program, and to define an action plan for the next few years.
Simulation
In 2017, the Simulation branch of the competence center undertook a survey of the GEANT4 competences at LIP. The following items were identified:
- LIP has been a member of the GEANT4 collaboration for more than 10 years, accumulating significant expertise from both the user and the developer points of view, with important know-how beyond application development (a minimal application skeleton is sketched after this list);
- LIP members hold expertise in several GEANT4 kernel categories;
- There is a potential to increase LIP’s contribution to the GEANT4 toolkit;
- LIP members undertake teaching activities in MSc- and PhD-level courses, with some emphasis on GEANT4.
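The sketch below is a minimal, self-contained GEANT4 application of the kind used as a starting point in such teaching and application-development work: a single water box serving as the world volume, the QBBC reference physics list and a geantino gun. The geometry, physics list choice and event count are illustrative assumptions and do not correspond to any particular LIP application.

// Minimal GEANT4 application skeleton (illustrative sketch only, not an actual LIP application):
// a 1 m water cube as the world volume, the QBBC reference physics list and a geantino gun.
#include "G4RunManager.hh"
#include "G4VUserDetectorConstruction.hh"
#include "G4VUserPrimaryGeneratorAction.hh"
#include "G4NistManager.hh"
#include "G4Box.hh"
#include "G4LogicalVolume.hh"
#include "G4PVPlacement.hh"
#include "G4ParticleGun.hh"
#include "G4Geantino.hh"
#include "G4Event.hh"
#include "G4ThreeVector.hh"
#include "G4SystemOfUnits.hh"
#include "QBBC.hh"

// Geometry: a single water box used directly as the world volume.
class SimpleDetector : public G4VUserDetectorConstruction {
public:
  G4VPhysicalVolume* Construct() override {
    auto* water = G4NistManager::Instance()->FindOrBuildMaterial("G4_WATER");
    auto* solid = new G4Box("World", 0.5 * m, 0.5 * m, 0.5 * m);
    auto* logic = new G4LogicalVolume(solid, water, "World");
    return new G4PVPlacement(nullptr, G4ThreeVector(), logic, "World",
                             nullptr, false, 0);
  }
};

// Primary generator: one 1 GeV geantino per event, fired along +z.
class SimpleGenerator : public G4VUserPrimaryGeneratorAction {
public:
  SimpleGenerator() : fGun(new G4ParticleGun(1)) {
    fGun->SetParticleDefinition(G4Geantino::Definition());
    fGun->SetParticleEnergy(1. * GeV);
    fGun->SetParticleMomentumDirection(G4ThreeVector(0., 0., 1.));
  }
  void GeneratePrimaries(G4Event* event) override {
    fGun->GeneratePrimaryVertex(event);
  }
private:
  G4ParticleGun* fGun;
};

int main() {
  auto* runManager = new G4RunManager;
  runManager->SetUserInitialization(new SimpleDetector);
  runManager->SetUserInitialization(new QBBC);   // reference physics list
  runManager->SetUserAction(new SimpleGenerator);
  runManager->Initialize();
  runManager->BeamOn(10);                        // simulate 10 events
  delete runManager;
  return 0;
}

Compiled against the GEANT4 libraries (for instance with the CMake setup shipped with the toolkit), such a skeleton is the base onto which sensitive detectors, scoring and further user actions are normally added.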
Big Data
The Big Data branch of the competence center carried out a survey of the big-data and machine-learning competences at LIP, and the following items were identified:
- Development of multivariate data analysis using advanced techniques (e.g. boosted decision trees, shallow and deep neural networks and principal component analysis);
- Expertise in modern tools used in HEP and beyond (e.g. TMVA, Octave, Keras, scikit-learn, pandas, Theano, TensorFlow), as illustrated by the sketch after this list;
- Expertise in advanced methods for the training and validation of multivariate analyses (e.g. the use of accelerators such as GPUs, distributed training and cross-validation);
- Expertise in complex file systems and tools to deal with large volumes of data.
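As a concrete illustration of this tool set, the sketch below books and trains a boosted decision tree with TMVA, the multivariate analysis package distributed with ROOT. The input file, tree names, variables and BDT hyperparameters (events.root, signal, background, pt, eta, met, NTrees, etc.) are hypothetical placeholders, and further options such as TMVA's cross-validation and deep-learning interfaces are left out for brevity.

// Hypothetical TMVA sketch: train a boosted decision tree classifier on
// signal/background trees and evaluate it on an automatically held-out test sample.
// File, tree and variable names below are illustrative placeholders.
#include "TFile.h"
#include "TTree.h"
#include "TCut.h"
#include "TMVA/Tools.h"
#include "TMVA/Types.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"

int main() {
  TMVA::Tools::Instance();

  auto* inFile  = TFile::Open("events.root");                  // hypothetical input file
  auto* sigTree = static_cast<TTree*>(inFile->Get("signal"));
  auto* bkgTree = static_cast<TTree*>(inFile->Get("background"));

  auto* outFile = TFile::Open("tmva_output.root", "RECREATE");
  auto* factory = new TMVA::Factory("Classification", outFile,
                                    "!V:!Silent:AnalysisType=Classification");

  auto* loader = new TMVA::DataLoader("dataset");
  loader->AddVariable("pt",  'F');                             // discriminating variables
  loader->AddVariable("eta", 'F');
  loader->AddVariable("met", 'F');
  loader->AddSignalTree(sigTree, 1.0);
  loader->AddBackgroundTree(bkgTree, 1.0);
  loader->PrepareTrainingAndTestTree("", "SplitMode=Random:NormMode=NumEvents:!V");

  // Boosted decision tree with AdaBoost; hyperparameters are illustrative.
  factory->BookMethod(loader, TMVA::Types::kBDT, "BDT",
                      "NTrees=400:MaxDepth=3:BoostType=AdaBoost:AdaBoostBeta=0.5");

  factory->TrainAllMethods();
  factory->TestAllMethods();
  factory->EvaluateAllMethods();

  outFile->Close();
  delete factory;
  delete loader;
  return 0;
}

The same workflow is commonly driven from a ROOT macro instead of a compiled program; in either case TMVA stores its training, testing and evaluation output in the output file for later inspection (e.g. ROC curves and overtraining checks).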
Publications

Quantum machine learning in HEP
Author(s): Pedro Carvalho, Bruna Salgado, Gabriel Domingues, Catarina Felgueiras
Submission: 2023-12-31, Publication: 2023-12-31
Reference: LIP-STUDENTS-23-27

Anomaly detection as a tool for discovering the unexpected at colliders
Author(s): Simão Silva Cardoso, Daniel Sousa, João Ferreira, Fábio Lucas Carneiro
Submission: 2023-12-31, Publication: 2023-12-31
Reference: LIP-STUDENTS-23-26

Fitting a Collider in a Quantum Computer: Tackling the Challenges of Quantum Machine Learning for Big Datasets
Author(s): Miguel Caçador Peixoto, Nuno Filipe Castro, Miguel Crispim Romão, Maria Gabriela Jordão Oliveira, Inês Ochoa
Submission: 2023-07-28, Acceptance: 2023-11-20, Publication: 2023-12-15
Reference: Front. Artif. Intell. 6 (2023) 1268852

Berry: A code for the differentiation of Bloch wavefunctions from DFT calculations
Author(s): Leander Reascos, Fábio Carneiro, André Pereira, Nuno Filipe Castro, Ricardo Mendes Ribeiro
Submission: 2023-05-31, Acceptance: 2023-10-17, Publication: 2023-10-27
Reference: Computer Physics Communications 295 (2024) 108972
Coordinators
Nuno Castro
Bernardo Tomé