WireFall

Team Lead

Self-Hardening Transformer Web Application Firewall via Incremental Learning

SIH 2025 Top 5 · DistilBERT · LoRA · LLM Agent
  • Built a firewall that catches zero-day attacks using a DistilBERT encoder trained via Masked Language Modeling (MLM) to produce a reconstruction-loss anomaly score, coupled with a frozen contrastive [CLS] head, detecting zero-day anomalies with 93.78% recall.
  • Designed a red-team self-hardening loop in which a fine-tuned Llama agent interprets high-loss anomalies and autonomously generates regex signatures, injecting them into the static engine to patch new exploits immediately.
  • Implemented a continual learning framework that prevents catastrophic forgetting, using Jensen-Shannon (JS) divergence for stable knowledge distillation; derived a gradient-sensitivity metric to dynamically rank and stack LoRA adapters by optimization relevance.
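The first bullet's detection idea can be sketched in miniature: a request is scored by the mean negative log-likelihood the MLM head assigns to its masked tokens, and flagged when that reconstruction loss exceeds a threshold. The probabilities and threshold below are illustrative placeholders, not values from the project.

```python
import math

def reconstruction_loss(token_probs):
    """Mean negative log-likelihood the MLM head assigns to the
    masked tokens of a request (lower = more 'normal')."""
    return -sum(math.log(p) for p in token_probs) / len(token_probs)

def is_anomalous(token_probs, threshold=2.0):
    """Flag a request whose reconstruction loss exceeds the threshold."""
    return reconstruction_loss(token_probs) > threshold

# Benign request: the encoder predicts its tokens with high probability.
benign = [0.9, 0.8, 0.95, 0.85]
# Zero-day payload: unfamiliar tokens receive low probability.
exploit = [0.05, 0.1, 0.02, 0.08]

print(is_anomalous(benign), is_anomalous(exploit))
```

Because the model was trained only on benign traffic, unseen attack tokens reconstruct poorly and accumulate loss, which is what makes the score usable for zero-day detection.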
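The red-team loop in the second bullet amounts to hot-patching a static rule engine with agent-generated signatures. A minimal sketch, assuming the engine is just a list of compiled regexes (the `StaticEngine` class and the example pattern are hypothetical, not taken from the project):

```python
import re

class StaticEngine:
    """Minimal static rule engine: a list of compiled regex signatures."""
    def __init__(self):
        self.rules = []

    def inject(self, pattern):
        """Hot-patch: add a new signature without restarting the engine."""
        self.rules.append(re.compile(pattern, re.IGNORECASE))

    def blocks(self, request):
        """True if any signature matches the request."""
        return any(r.search(request) for r in self.rules)

engine = StaticEngine()
# A signature the Llama agent might emit after inspecting a high-loss
# anomaly (the pattern is illustrative only).
engine.inject(r"union\s+select")
print(engine.blocks("GET /?q=1 UNION SELECT password FROM users"))
```

Once injected, the signature blocks repeat attempts at regex speed, so the expensive transformer path only has to catch each exploit family once.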
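The distillation term in the third bullet uses Jensen-Shannon divergence, which is symmetric and bounded by ln 2, so it stays stable even when the old and new output distributions barely overlap. A self-contained sketch (the toy distributions are illustrative):

```python
import math

def kl(p, q):
    """KL divergence D(p || q) in nats; assumes matching support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: 0.5*KL(p||m) + 0.5*KL(q||m)
    with m the midpoint distribution. Symmetric, bounded by ln(2)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

old = [0.7, 0.2, 0.1]   # teacher (pre-update model) distribution
new = [0.6, 0.3, 0.1]   # student (updated model) distribution
print(js_divergence(old, new))
```

Unlike plain KL, the midpoint distribution `m` always has support wherever either model does, which is why JS avoids the unbounded penalties that destabilize distillation losses.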
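The adapter-ranking step can be illustrated with a common saliency proxy, mean |weight × gradient| per adapter; the project's exact gradient-sensitivity metric is not specified here, so this formula and the adapter names are assumptions for the sketch.

```python
def sensitivity(weights, grads):
    """Saliency proxy (assumed): mean |w * g| over an adapter's parameters."""
    return sum(abs(w * g) for w, g in zip(weights, grads)) / len(weights)

def rank_adapters(adapters):
    """Return adapter names in descending sensitivity order, i.e. the
    stacking order that favors the most optimization-relevant adapters."""
    scores = {name: sensitivity(w, g) for name, (w, g) in adapters.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical per-adapter (weights, gradients) snapshots.
adapters = {
    "sqli": ([0.5, -0.3], [0.2, 0.4]),
    "xss":  ([0.1,  0.2], [0.05, 0.1]),
    "lfi":  ([0.8, -0.6], [0.5, 0.3]),
}
print(rank_adapters(adapters))
```

Ranking by a gradient-based score lets the framework keep the adapters the optimizer is actively relying on while demoting ones that no longer influence the loss.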