Hi, I'm Haseeb, a final-year CS undergraduate at LUMS (exp. 2026) working on efficient, generalizable, and interpretable machine learning. My research focuses on model compression and domain generalization, with an emphasis on understanding failure modes in modern neural networks and designing more robust learning systems.

I currently work as a Research Assistant at the Centre for Urban Informatics, Technology and Policy (CITY) at LUMS under Dr. Muhammad Tahir and Dr. Zubair Khalid. My work includes the Backbone Contrastive Pruning (BaCP) framework, which mitigates representational collapse in highly sparse networks. My core focus is on using mechanistic interpretability to analyze failure modes in distributed and domain-shift settings: this includes reframing weight divergence in Federated Learning as "circuit collapse", and decomposing models into domain-invariant and domain-specific circuits to better understand and address performance degradation under distribution shift.

I have also collaborated with 10x Engineers on quantizing diffusion models, developing a method that trains lightweight MLPs on per-layer timestep statistics to predict quantization scale and zero-point.
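The idea of predicting quantization parameters from timestep statistics can be sketched as follows. This is a hypothetical illustration, not the actual method: the module name, the choice of statistics (mean, std, min, max), and the network size are all my assumptions.

```python
# Hypothetical sketch: a tiny MLP that maps per-layer activation statistics
# at a given diffusion timestep to a predicted quantization scale and
# zero-point. Feature choices and sizes are illustrative, not the real method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScalePredictor(nn.Module):
    def __init__(self, num_stats: int = 4, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_stats, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),  # two outputs: raw scale, zero-point
        )

    def forward(self, stats: torch.Tensor):
        out = self.net(stats)
        scale = F.softplus(out[..., 0])  # softplus keeps the scale positive
        zero_point = out[..., 1]
        return scale, zero_point

# Per-layer statistics for a batch of 8 timesteps: e.g. [mean, std, min, max]
stats = torch.randn(8, 4)
scale, zp = ScalePredictor()(stats)
```

A predictor like this is cheap to evaluate at inference time, which matters when quantization parameters must vary across the many denoising steps of a diffusion model.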

Previously, I contributed to research at the Computer Vision & Graphics Lab, where I developed KLAWQ, a GPTQ-inspired quantization method that improves perplexity by up to 30% over vanilla GPTQ. I also engineered a transformer-based framework with mathematical constraints for single-image camera calibration. I was selected for the UIUC Summer Research Internship 2025, where I worked with Dr. Darko Marinov on LLMs for automated software-engineering tasks and briefly with Dr. Reyhaneh Jabbarvand on LLM-based C-to-Rust translation.

Beyond academia, I worked on contract as a Machine Learning Engineer at Innova Tech, quantizing and deploying vision models on edge devices using ONNX and TensorRT. I am also an active open-source contributor to libraries such as timm and adapters.
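The core arithmetic behind ONNX/TensorRT-style INT8 quantization can be sketched in a few lines. This is a generic illustration of asymmetric (affine) post-training quantization, assuming scale and zero-point are computed from observed min/max; it is not the specific pipeline used in any of the roles above.

```python
# Minimal sketch of affine INT8 quantization: map a float tensor to int8
# using a scale and zero-point derived from its observed min/max, then
# dequantize back. Reconstruction error is bounded by the step size.
import numpy as np

def quantize_int8(x: np.ndarray):
    qmin, qmax = -128, 127
    x_min, x_max = float(x.min()), float(x.max())
    scale = (x_max - x_min) / (qmax - qmin)
    zero_point = int(round(qmin - x_min / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    return (q.astype(np.float32) - zero_point) * scale

x = np.linspace(-1.0, 2.0, 100).astype(np.float32)
q, s, zp = quantize_int8(x)
x_hat = dequantize(q, s, zp)
```

Real deployment pipelines add calibration over representative data, per-channel scales for weights, and fused quantize/dequantize nodes in the graph, but the scale/zero-point mapping above is the primitive they all build on.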

Experience & Teaching

Machine Learning Engineer (Contract) at Innova Tech
Owned the optimization pipeline for on-device object detection. Led ONNX/TensorRT quantization, reducing memory usage by 40% while maintaining real-time inference. Automated the data annotation pipeline, improving accuracy by 4%.
Teaching Assistant, CS436: Computer Vision at LUMS
Designed assignments on PyTorch transfer learning and C++/OpenCV pipelines. Mentored 40 groups (80+ students) on SfM-based virtual tour projects.

Open Source Contributions

Contributor, pytorch-image-models (timm)
Implemented F1, precision, recall metrics for distributed training using torch.distributed.
Contributor, adapters (PEFT Library)
Added GQA support for Llama-2/Mistral PEFT. Fixed a tensor shape mismatch bug, enabling PEFT for modern LLMs.
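The distributed-metric pattern from the timm contribution can be sketched roughly as follows: accumulate local true-positive/false-positive/false-negative counts, all-reduce them across workers, then compute precision, recall, and F1 from the global counts. This is a simplified binary-classification sketch under my own naming, not the actual contributed code.

```python
# Sketch of distributed precision/recall/F1: sum confusion counts across
# ranks with torch.distributed.all_reduce, then derive metrics from the
# global totals. Falls back to local counts when not running distributed.
import torch
import torch.distributed as dist

def f1_metrics(preds: torch.Tensor, targets: torch.Tensor, positive: int = 1):
    tp = ((preds == positive) & (targets == positive)).sum()
    fp = ((preds == positive) & (targets != positive)).sum()
    fn = ((preds != positive) & (targets == positive)).sum()
    counts = torch.stack([tp, fp, fn]).float()
    if dist.is_available() and dist.is_initialized():
        # Sum the raw counts over all workers before computing ratios;
        # averaging per-rank F1 scores directly would be incorrect.
        dist.all_reduce(counts, op=dist.ReduceOp.SUM)
    tp, fp, fn = counts.tolist()
    precision = tp / max(tp + fp, 1e-12)
    recall = tp / max(tp + fn, 1e-12)
    f1 = 2 * precision * recall / max(precision + recall, 1e-12)
    return precision, recall, f1

preds = torch.tensor([1, 0, 1, 1])
targets = torch.tensor([1, 0, 0, 1])
p, r, f1 = f1_metrics(preds, targets)
```

Reducing counts rather than per-rank metric values is the key design choice: F1 is a ratio of ratios, so it only aggregates correctly when the confusion counts are summed globally first.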

Research Interests

  • Model Compression (Pruning & Quantization)
  • Domain Generalization & OOD Robustness
  • Mechanistic Interpretability
  • AI on Edge Devices
  • Camera Calibration & 3D Reconstruction