How to Prevent PDFs and Other Squarespace File Uploads from Being Indexed by Search Engines
If you’ve uploaded PDF files to your Squarespace website, you might have noticed that these files can be indexed by Google and other search engines.
This can be a concern if you want certain files to remain private or hidden from search engines.
The Challenges
PDFs uploaded to Squarespace are stored on a CDN that isn’t secure or private.
While Squarespace ensures these files aren’t listed in the sitemap, they can still be discovered and indexed if crawlers find the links elsewhere.
Squarespace’s limited customization options make it tricky to block these files using traditional methods like robots.txt rules or server-side configuration.
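For context, on a server you fully control, you could stop well-behaved crawlers from indexing uploaded files with a robots.txt rule like the sketch below (the `/s/` path matches where Squarespace stores uploads). Squarespace generates robots.txt automatically and doesn't let you edit it, which is why this standard approach isn't available here:

```text
User-agent: *
Disallow: /s/
```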
The Solution
NB. This solution will work for all files uploaded to your Squarespace site via the file upload dialog.
To prevent PDF files from being indexed, I’ve developed a simple JavaScript-based workaround. This solution obscures the links to your PDFs in a way that prevents crawlers from finding them while still allowing users to access the files.
Step 1: Modify the PDF Links
Change your PDF links to an intentionally broken URL format, as follows:
Original:
/s/my-document.pdf
Modified:
/scramble/my-document.pdf
This broken link ensures crawlers can’t follow the path to the file. But neither can your site visitors, which is where Step 2 comes in.
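In the underlying page markup, the change looks like this (the filename is just an example):

```html
<!-- Before: crawlers can follow this link to the file -->
<a href="/s/my-document.pdf">Download the PDF</a>

<!-- After: the path is intentionally broken, so crawlers hit a dead end -->
<a href="/scramble/my-document.pdf">Download the PDF</a>
```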
Step 2: Use JavaScript to Fix the Links Dynamically
The JavaScript will listen for user clicks on the broken links, correct the URL, and then redirect the user to the actual file.
Understanding Squarespace’s /s/ Folder
When you upload files to your Squarespace site, they are stored in a relative folder called /s/ on the Squarespace CDN.
This /s/ folder is used for all uploaded files, such as PDFs, images, and other media.
Squarespace does not include password protection for these files, which is why additional steps are needed to manage their visibility.
Instructions for Adding the JavaScript
1. Open the Footer Code Injection Area
Log in to your Squarespace site.
Navigate to Settings > Advanced > Code Injection.
Scroll to the Footer section.
2. Paste the Following Code
Insert the following script into your site’s footer code injection section:
```html
<script>
// Protect PDFs and other file uploads from being indexed by Google
// Copyright (c) Colin Irwin - https://silvabokis.com
document.addEventListener('DOMContentLoaded', function() {
  document.querySelectorAll('a[href*="/scramble/"]').forEach(function(link) {
    link.addEventListener('click', function(event) {
      event.preventDefault();
      const scrambledUrl = link.getAttribute('href');
      const originalUrl = scrambledUrl.replace('/scramble/', '/s/');
      window.location.href = originalUrl;
    });
  });
});
</script>
```
How It Works
The script identifies all links on the page whose URLs contain /scramble/. When a user clicks one of these links, the script:
Prevents the default action, which would otherwise result in a 404 error.
Replaces /scramble/ with /s/ in the link.
Redirects the user to the valid URL for the file.
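One possible refinement, sketched below rather than taken from the original script: attaching a single delegated click listener to the document means links added to the page after load (for example, by content that Squarespace renders lazily) are also handled. The `unscramble` helper name is my own, introduced for illustration:

```javascript
// Restore the real /s/ path from a deliberately broken /scramble/ URL.
function unscramble(url) {
  return url.replace('/scramble/', '/s/');
}

// Guard so this sketch is also loadable outside a browser environment.
if (typeof document !== 'undefined') {
  // One delegated listener covers current and future /scramble/ links.
  document.addEventListener('click', function (event) {
    const link = event.target.closest('a[href*="/scramble/"]');
    if (!link) return;
    event.preventDefault();
    window.location.href = unscramble(link.getAttribute('href'));
  });
}
```

Because the listener lives on the document rather than on each link, there is no need to re-run the script when new links appear.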
Benefits
Privacy: Search crawlers can’t index the broken links.
Accessibility: Users can still access the files seamlessly.
Ease of Use: No need for advanced server-side changes or edits to robots.txt.
Limitations
This solution won’t work on Squarespace’s lowest subscription tier, which doesn’t support code injection.
It doesn’t fully secure the files: anyone who knows or guesses the exact /s/ URL can still access them directly.
Conclusion
This JavaScript plugin is an effective and simple way to obscure PDF links on your Squarespace site and prevent them from being indexed by search engines. While it’s not a foolproof security measure, it provides a practical workaround for most use cases. Add the script to your site today to protect your content and maintain better control over your files.