I am currently on the job market seeking ML engineering & research opportunities starting Dec 2023!
            I'm a second-year master's student in Computer Science at The University of Texas at Austin, advised by Prof. Chandrajit Bajaj and affiliated with the CVC Lab @ Oden Institute. My research centers on Reinforcement Learning (RL), Bayesian methods, and sequential decision-making, and their applications. Currently, my work is motivated by problems in medical image diagnosis.
            Before coming to UT Austin, I worked on heteroscedastic uncertainty quantification in segmentation models and VAEs with Prof. Chandrajit Bajaj. Prior to that, I researched & engineered generative language models at the Web Science Lab @ International Institute of Information Technology, Bangalore (IIIT-B) under Prof. Srinath Srinivasa.
            I graduated with my B.Tech. in Computer Science from Mahindra University, where I worked with Prof. Arya Bhattacharya on gradient-free optimization algorithms and with Prof. Bruhadeshwar Bezawada on algebraic methods for symbolic execution. In my free time, I enjoy running, playing tennis (and squash), and traveling!
            News
            
                - Aug 2022: New preprint on optimal sampling for tensor decomposition is now out on arXiv!
 
                - Nov 2021: Gave a talk on Multi-Agent RL for Efficient Rank-Ordered Search @ Modern Inverse Problems Annual Meeting, RWTH Aachen University!
 
                - Jun 2021: Will be joining the UT Austin M.S. in Computer Science program in Fall 2021!
 
                - Mar 2021: Two papers on training neural networks using differential evolution accepted @ IEEE CEC 2021!
 
            
            Preprints
            
                - Learning Generative Embeddings using an Optimal Subsampling Policy for Tensor Sketching.
 
                - Chandrajit Bajaj, Taemin Heo, Rochan Avlur
 
                - Under Review, 2022.
 
                - [arXiv][code soon]
 
            
            Publications
            
                - Comparative Performances of Neural Networks of Variant Architectures Trained with Backpropagation and Differential Evolution.
 
                - Zakaria Oussalem, Rochan Avlur, Jhanavi Malagavalli, Arya Bhattacharya
 
                - IEEE Congress on Evolutionary Computation (CEC), 2021.
 
                - [Paper]
 
            
            
                - Training Convolutional Neural Networks with Differential Evolution using Concurrent Task Apportioning.
 
                - Rochan Avlur, Zakaria Oussalem, Arya Bhattacharya
 
                - IEEE Congress on Evolutionary Computation (CEC), 2021.
 
                - [Paper]
 
            
            
                - Lossless video compression using Bayesian Networks and Entropy Coding.
 
                - Rochan Avlur, Chandrasekar Vaidyanathan
 
                - IEEE Region 10 Symposium, 2019.
 
                - [Paper]
 
            
            Projects
            
                - Investigating Causal Overhypothesis.
 
                - Investigated representations in parametric latent variable models (LVMs) & proposed a weighted-mixture scheme to induce compositionality.
 
                - [Paper][Code]
 
            
            
                - Generalization via Adaptive Planning in Model-based RL.
 
                - Investigated the role of trajectory evaluation for open-loop model-predictive planning in model-based RL.
 
                - [Paper][Code]
 
            
            
                - Controlling Estimation Bias in Q-learning.
 
                - Proposed a multi-armed bandit setup for online adaptive control of estimation bias in Q-learning, based on Lan et al.
 
                - [Paper][Code]
 
            
            
                - Learning Node Characteristics for Reliable Leader Election.
 
                - Proposed a multi-agent multi-armed bandit algorithm for consensus in asynchronous distributed systems; reduced node failures by up to 20%.
 
                - [Paper][Code]
 
            
            
                - Representation Damage in Pruned Multi-modal Transformers.
 
                - Investigated the influence of pruning large multi-modal models (CLIP) on latent representations by analyzing multi-modal neuron activations & performance on downstream tasks.
 
                - [Paper]