Good tools remove friction from creative flow. During my research sprints for Cruz AI clients, I’m constantly reverse-engineering funnels and studying landing page markup. Chrome’s developer tools are powerful, but they’re also heavy when I just need the source and a clean copy button. So I scoped a focused web app: paste a URL, fetch the HTML via a CORS-friendly proxy, display it with legible typography, and make copying painless.
Success Criteria
- Fetch most public webpages without server-side code.
- Provide instant visual feedback (loading states, errors, success).
- Keep the markup scrollable, readable, and copy-ready.
- Ship quickly enough to fold into the Featured Applications gallery.
Designing a focused interface
I started with Tailwind CSS, the same utility framework powering the rest of the app gallery. The layout is intentionally compact: a centered card, one call-to-action, and no competing controls. A slim, uppercase feature grid underneath reinforces the value props without distracting from the input field.
Input validation happens before the fetch call even fires. If the URL fails a `new URL()` parse, the app nudges the user to correct it. That micro-guardrail saves API calls and clarifies what went wrong.
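A minimal sketch of that guardrail, assuming the app normalizes input before validating (the function name and scheme-defaulting behavior here are my own, not necessarily the app's):

```javascript
// Validate user input before firing any fetch.
// Returns a normalized URL string, or null if the input can't be parsed.
function normalizeUrl(input) {
  const trimmed = input.trim();
  // Assume https:// when the user omits the scheme (hypothetical convenience).
  const candidate = /^https?:\/\//i.test(trimmed) ? trimmed : `https://${trimmed}`;
  try {
    return new URL(candidate).href; // new URL() throws on malformed input
  } catch {
    return null; // caller shows a correction hint instead of calling the API
  }
}
```

Returning `null` rather than throwing keeps the UI code to a single truthiness check before enabling the fetch button.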
UI quality-of-life touches
- Disabled button + spinner swap to signal progress.
- Soft shadows and rounded corners to mirror other Cruz AI utilities.
- Scrollbar theming so long documents still feel polished.
Engineering the fetch pipeline
Browsers block cross-origin requests to arbitrary domains, so the secret weapon here is AllOrigins, a free proxy that adds permissive CORS headers. The fetch flow is straightforward:
- Normalize the user input into a safe URL string.
- Request `https://api.allorigins.win/raw?url=<encoded target>`.
- Stream the response text into the `<pre>` container.
- Flip the UI state based on success or failure.
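The steps above can be sketched roughly like this (the element id and function names are illustrative assumptions, not lifted from the app):

```javascript
const PROXY = 'https://api.allorigins.win/raw?url=';

// Build the proxied request URL; encodeURIComponent keeps the target's
// query string from being swallowed by the proxy's own ?url= parameter.
function proxiedUrl(target) {
  return PROXY + encodeURIComponent(target);
}

// Fetch the raw HTML and pour it into the viewer (id is hypothetical).
async function loadSource(target) {
  const pre = document.getElementById('source-output');
  try {
    const res = await fetch(proxiedUrl(target));
    if (!res.ok) throw new Error(`Proxy responded with ${res.status}`);
    pre.textContent = await res.text(); // textContent avoids injecting live markup
  } catch (err) {
    pre.textContent = ''; // clear stale output before the error state takes over
    throw err;            // caller maps this to a human-readable message
  }
}
```

Using `textContent` rather than `innerHTML` is the important design choice: the fetched document renders as inert source, never as live markup inside the viewer.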
Error handling needed to stay human. Instead of surfacing raw network errors, the message clarifies whether the site might be blocking requests or if the proxy can’t reach it. That transparency is critical when analysts are on deadline.
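One way to keep those messages human is a small mapping from failure mode to copy; the wording and heuristics below are my own sketch, not the app's exact strings:

```javascript
// Translate low-level failures into analyst-friendly copy.
// fetch() rejects with a TypeError on network/CORS failures, so that case
// maps to "proxy unreachable"; HTTP status codes embedded in the message
// (a convention assumed from the fetch sketch) hint at blocking.
function friendlyError(err) {
  if (err instanceof TypeError) {
    return 'The proxy could not be reached. Check your connection and try again.';
  }
  if (/\b(401|403)\b/.test(err.message)) {
    return 'This site appears to be blocking automated requests.';
  }
  if (/\b404\b/.test(err.message)) {
    return 'The proxy could not find that page. Double-check the URL.';
  }
  return 'Something went wrong fetching the source. The site may require cookies or block proxies.';
}
```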
Clipboard UX without surprises
Clipboard access varies wildly across browsers and embedded contexts. I opted for a reliable fallback: inject a hidden `<textarea>`, call `document.execCommand('copy')`, and clean up immediately. It's not glamorous, but it works even when more modern Clipboard APIs are restricted.
A two-second state swap (“Copied!” → “Copy Code”) reassures the user that the action succeeded. That small touch makes repeated copying comfortable, especially when the markup spans hundreds of lines.
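A sketch of that fallback plus the two-second swap, assuming a button element is passed in (the labels and styling details are illustrative):

```javascript
// Copy arbitrary text via a hidden <textarea>; works where the async
// Clipboard API is unavailable or permission-gated.
function copyViaTextarea(text) {
  const ta = document.createElement('textarea');
  ta.value = text;
  ta.style.position = 'fixed'; // keep it out of the layout flow
  ta.style.opacity = '0';
  document.body.appendChild(ta);
  ta.select();
  const ok = document.execCommand('copy'); // deprecated, but widely supported
  document.body.removeChild(ta);           // clean up immediately
  return ok;
}

// Swap the button label, then restore it after two seconds.
function flashCopied(button) {
  const original = button.textContent;
  button.textContent = 'Copied!';
  setTimeout(() => { button.textContent = original; }, 2000);
}
```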
- Friday evening: scoped the MVP, sketched the UI, and wired the Tailwind container.
- Saturday sprint: implemented fetch logic, error states, and the clipboard fallback.
- Sunday polish: refined copy, added the feature grid, and integrated with the app gallery.
Limitations & next experiments
Public proxies are a gift, but they’re not bulletproof. Some sites deploy aggressive bot protection or require cookies. When that happens, the viewer surfaces a descriptive error so analysts know to grab the source via alternate means. Long term, I’ll add a serverless edge worker that signs requests with my own API key to guarantee higher throughput.
I also want to explore lightweight formatting—think syntax highlighting toggles or the option to prettify minified markup. For now, the focus is speed. The raw HTML arrives exactly as the server serves it, which is perfect for audits.
Try it yourself
Need to inspect a landing page or verify what a CMS is really shipping? Launch the tool, paste the URL, and copy the markup in seconds.
Launch the HTML Source Code Viewer →