My honest friends and superiors agreed that my biggest weakness is software development, so that's what I picked as part of my career 😎
- 🔭 I’m currently a modeling and simulation specialist and machine learning staff scientist at Idaho National Laboratory, and a member of the RAVEN development team. I work on several projects including (but not limited to) surrogate construction, reduced-order modeling, sparse sensing, metamodeling of porous materials, scaling interpolation and representativity of mockup experiments to target real-world plants, data-driven discovery of governing physics and system identification, digital twins, time-series analysis, Koopman theory, and agile software development.
- 🌱 I’d love to learn in the near future: MLOps, R, Caffe, MongoDB, MySQL, NoSQL, Scala, Julia, SAS, SPSS, Apache Spark, Kafka, Hadoop, Hive, MapReduce, Cassandra, Weka.
- 🤝 I’m looking to collaborate on physics-based neural networks.
- 💬 Ask me about ROM, uncertainty quantification, sensitivity analysis, active subspaces, probabilistic error bounds, dynamic mode decomposition (DMD).
- ⚡ Fun fact: I like basketball, volleyball, and soccer.
- 🐦 [Twitter][twitter] | 📺 [YouTube][youtube] | 📷 [Instagram][instagram]
🤖👽 Machine Learning: regression, regularization, classification, clustering, collaborative filtering, support vector machines, naive Bayes, decision trees, random forests, anomaly detection, recommender systems, artificial data synthesis, ceiling analysis, Artificial Neural Networks (ANNs), Deep Neural Networks (DNNs), Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM) networks, Natural Language Processing (NLP), Transformer models, attention mechanisms.
Reduced-Order Modeling: PCA, PPCA, KPCA, Isomap, Laplacian eigenmaps, LLE, HLLE, LTSA, surrogate modeling, Koopman theory, time-delayed embeddings, dynamic mode decomposition (DMD), dynamical systems and control, data-driven (equation-free) modeling, sparse identification of nonlinear dynamics (SINDy), compressive sensing for full-map recovery from sparse measurements, time-series analysis, ARMA, ARIMA.
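Dynamic mode decomposition shows up in several of the lists above; as a flavor of what it does, here is a minimal exact-DMD sketch in NumPy (a generic textbook formulation with my own variable names, not RAVEN's API):

```python
import numpy as np

def dmd(X, Y, r):
    """Exact DMD: fit a rank-r linear operator A with Y ~ A X.

    X, Y are snapshot matrices (one state per column); Y is X shifted
    one time step ahead. Returns DMD eigenvalues and modes.
    """
    # Truncated SVD of the snapshot matrix gives the leading POD modes
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    # Project the full operator onto the r-dimensional POD subspace
    A_tilde = U.conj().T @ Y @ Vh.conj().T / s
    eigvals, W = np.linalg.eig(A_tilde)
    # Lift the reduced eigenvectors back to full-state DMD modes
    modes = Y @ Vh.conj().T / s @ W
    return eigvals, modes
```

For a trajectory generated by a linear map, the DMD eigenvalues recover the map's spectrum, which is a quick sanity check for any implementation.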
Sensitivity Analysis (SA): Sobol indices, Morris screening, PAWN, moment-independent SA.
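First-order Sobol indices have a compact Monte Carlo estimator; a sketch of the Saltelli pick-freeze scheme for independent uniform inputs (the test function and sample size here are illustrative, not tied to any project above):

```python
import numpy as np

def first_order_sobol(f, d, n=100_000, seed=0):
    """Monte Carlo estimate of first-order Sobol indices S_i.

    f maps an (n, d) array of inputs to an (n,) array of outputs;
    inputs are assumed i.i.d. uniform on [0, 1]^d.
    """
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA, yB = f(A), f(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(d)
    for i in range(d):
        # "Pick-freeze": copy A but take coordinate i from B
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        yABi = f(ABi)
        # Saltelli (2010) estimator for the first-order effect of x_i
        S[i] = np.mean(yB * (yABi - yA)) / var
    return S
```

For an additive model like `y = 4*x1 + 2*x2` the indices are known in closed form (0.8 and 0.2), which makes a convenient check on the estimator.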
Uncertainty Quantification (UQ): forward UQ, adjoint UQ, inverse UQ.
Optimization: gradient-based optimizers, conjugate gradient; metaheuristics: simulated annealing, genetic algorithms.
🖥️ Programming Languages and Packages: Bash scripting, MATLAB, Python: NumPy, SciPy, Matplotlib, Plotly, Bokeh, Seaborn, pandas, Jupyter Notebook, scikit-learn, Keras, TensorFlow.
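Surrogate construction with the stack above can be as small as a Gaussian-process fit to a handful of expensive-model runs; a minimal scikit-learn sketch (the toy "simulation" `f` and the kernel settings are illustrative assumptions):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stand-in for an expensive simulation, sampled at only a few points
def f(x):
    return np.sin(3 * x) + 0.5 * x

X_train = np.linspace(0.0, 2.0, 12)[:, None]
y_train = f(X_train).ravel()

# GP surrogate: cheap to evaluate and provides predictive uncertainty
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X_train, y_train)

X_query = np.array([[0.7], [1.3]])
y_pred, y_std = gp.predict(X_query, return_std=True)
```

The predictive standard deviation `y_std` is what makes GP surrogates convenient for UQ workflows: it flags regions where the surrogate should not be trusted and more simulation runs are needed.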
**High Performance Computing (HPC)**
- 🕯️Machine Learning - Stanford Online | Intro to ML: (i) supervised learning (parametric/non-parametric algorithms, support vector machines, kernels, neural networks); (ii) unsupervised learning (clustering, dimensionality reduction, recommender systems, deep learning); (iii) best practices in machine learning (the bias/variance dilemma).
- 🕯️Neural Networks and Deep Learning - DeepLearning.AI | Build, train, and apply fully connected deep neural networks; implement efficient (vectorized) neural networks; identify key parameters in a neural network’s architecture.
- 🕯️Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization - DeepLearning.AI | L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; optimization algorithms such as mini-batch gradient descent, Momentum, RMSprop, and Adam; implement a neural network in TensorFlow.
- 🕯️Structuring Machine Learning Projects - DeepLearning.AI | Diagnose errors in a machine learning system; prioritize strategies for reducing errors; understand complex ML settings, such as mismatched training/test sets, and comparing to and/or surpassing human-level performance; and apply end-to-end learning, transfer learning, and multi-task learning.
- 🕯️Convolutional Neural Networks - DeepLearning.AI | Build a convolutional neural network, including recent variations such as residual networks; apply convolutional networks to visual detection and recognition tasks; and use neural style transfer to generate art and apply these algorithms to a variety of image, video, and other 2D or 3D data.
- 🕯️Sequence Models - DeepLearning.AI | Natural Language Processing, Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Recurrent Neural Networks, attention models.
- 🕯️Deep Learning Specialization - DeepLearning.AI
Pinned
- idaholab/raven: RAVEN is a flexible and multi-purpose probabilistic risk analysis, validation and uncertainty quantification, parameter optimization, model reduction, and knowledge-discovery framework.
- scipy_2022_causal_inference_tutorial (forked from ronikobrosly/scipy_2022_causal_inference_tutorial): a set of decks and notebooks with exercises for use in a hands-on causal inference tutorial session.
- deepscm (forked from biomedia-mira/deepscm): repository for Deep Structural Causal Models for Tractable Counterfactual Inference.
- AI-For-Beginners (forked from microsoft/AI-For-Beginners): 12 Weeks, 24 Lessons, AI for All!
- bel2scm (forked from bel2scm/bel2scm): generate dynamic structural causal models from biological knowledge graphs encoded in the Biological Expression Language (BEL).
- avici (forked from larslorch/avici): Amortized Inference for Causal Structure Learning, NeurIPS 2022.