Special Files
Oncord automatically generates and serves several special files that are essential for search engine optimisation (SEO) and site discoverability. These files are not stored on disk — they are dynamically generated by the platform when requested.
You don't need to create, edit, or optimise any of these files yourself. Oncord handles all SEO best practices automatically — your robots.txt, sitemap.xml, meta tags, syndication feeds, and Google Shopping feeds are always kept up to date as you add and update content. Every time you publish a page, product, or blog post, these files instantly reflect your changes and search engines are notified automatically. This means you can focus entirely on your content, knowing that the technical SEO fundamentals are already taken care of.
You can view these files at any time by appending the filename to your site's URL in a browser:
https://www.yourdomain.com/robots.txt
https://www.yourdomain.com/sitemap.xml
robots.txt
The robots.txt file tells search engine crawlers which pages they are allowed or disallowed from accessing. Every website needs one, and Oncord generates it automatically.
When a browser or search engine crawler requests /robots.txt, Oncord intercepts the request and dynamically builds the file contents. The generated file contains:
- A User-agent: * directive (applies to all crawlers)
- A Disallow line for each page that has Disable Indexing turned on
- A Sitemap directive pointing to the site's XML sitemap
For example:
User-agent: *
Disallow: /internal-page/
Disallow: /private-area/
Sitemap: https://www.example.com/sitemap.xml
A page appears as Disallow in robots.txt when its Disable Indexing checkbox is ticked. This setting is found in the admin under Pages → Edit Page → Settings → SEO → Show Advanced Options → Disable Indexing / Spidering. Each subsite generates its own robots.txt with paths adjusted for its domain.
sitemap.xml
The sitemap.xml file provides search engines with a structured list of all indexable URLs on the site. It follows the Sitemaps XML protocol and is generated dynamically. All URLs are always generated using https:// — you never need to worry about insecure HTTP URLs appearing in your sitemap.
What Gets Included
The sitemap includes URLs from multiple sources across the platform:
- Website Pages — all pages where indexing is enabled, the page is not hidden in the sitemap, the page is online, and no login-wall security is applied
- Product Pages — all online products (if the Commerce Products component is installed)
- Product Category Pages — all online categories plus the /products/categories/index page (if categories and products exist)
- Event Pages — upcoming, public events that do not require an invitation (if the Marketing Events component is installed)
Pages are excluded from the sitemap if they have Disable Indexing or Hidden in Sitemap enabled, are set to Offline, or have conditions-based security (login walls). On the primary domain, subsite pages are excluded as they have their own sitemap. Offline products, expired events, and invitation-only events are also excluded.
Each URL entry includes a loc (the full URL, prefixed with https://www. and the primary domain) and a lastmod (the last modified date in W3C format, when available). For example:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2026-03-15T10:30:00+10:00</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about-us/</loc>
    <lastmod>2026-02-20T14:00:00+10:00</lastmod>
  </url>
</urlset>
Automatic Search Engine Pings
Whenever a page, product, product category, or event is saved, Oncord automatically notifies search engines that the sitemap has been updated by sending a ping request to Google and Bing. To avoid excessive requests, the ping is throttled to once per day and only runs on live (production) websites. The last ping time is displayed in the admin under the page's SEO settings tab.
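A sitemap ping is a plain HTTP GET request with the sitemap URL passed as a query parameter. The well-known endpoints historically used for this take the shape below; these URLs are shown for illustration only, as the exact requests Oncord issues are handled internally:

https://www.google.com/ping?sitemap=https%3A%2F%2Fwww.example.com%2Fsitemap.xml
https://www.bing.com/ping?sitemap=https%3A%2F%2Fwww.example.com%2Fsitemap.xml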
Meta Robots Tag
In addition to the robots.txt file, Oncord outputs a <meta name="robots"> tag in the <head> of every page. The tag is constructed from two page settings: Disable Indexing (controls index / noindex) and Disable Follow (controls follow / nofollow). For example:
<meta name="robots" content="index,follow,max-image-preview:large,max-snippet:-1,max-video-preview:-1">
When indexing is enabled, Oncord appends additional directives that allow Google to display rich results: max-image-preview:large, max-snippet:-1, and max-video-preview:-1. These allow large image previews, unlimited text snippets, and unlimited video previews in search results.
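For instance, ticking Disable Indexing on a page (with Disable Follow left off) would produce a tag like the following; the rich-result directives are omitted because they only apply when indexing is enabled:

<meta name="robots" content="noindex,follow">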
Oncord also automatically sets noindex,nofollow on non-primary domains (such as staging sites on *.sslsvc.com) and on subsite pages viewed through the primary domain before their own domain is live. This prevents duplicate content issues and stops staging sites from being indexed, without any manual configuration.
HTML Sitemap Page
In addition to the XML sitemap for search engines, Oncord can include a human-readable HTML sitemap page at /sitemap/. This displays a hierarchical list of all public pages — root-level pages shown as bold links with child pages listed beneath. Pages with Hidden in Sitemap enabled are excluded. This page is useful for visitors who want to see a complete overview of the site's structure, and it also provides additional internal linking which can benefit SEO.
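As a rough illustration, the rendered page is a nested list of links with root pages in bold (the exact markup Oncord outputs may differ):

<ul>
  <li><strong><a href="/about-us/">About Us</a></strong>
    <ul>
      <li><a href="/about-us/our-team/">Our Team</a></li>
      <li><a href="/about-us/careers/">Careers</a></li>
    </ul>
  </li>
  <li><strong><a href="/contact/">Contact</a></strong></li>
</ul>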
RSS and Atom Feeds
Oncord automatically generates syndication feeds for blog posts and products. These feeds allow visitors and third-party services to subscribe to content updates. Feed URLs are injected into the page's <head> as <link rel="alternate"> tags so that browsers and feed readers can auto-discover them. By default, feeds are limited to the 25 most recent items, ordered by creation date.
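The injected auto-discovery tags take roughly this form (illustrative; the title text and which feeds are advertised may vary):

<link rel="alternate" type="application/rss+xml" title="Posts RSS" href="https://www.example.com/feeds/posts/rss/">
<link rel="alternate" type="application/atom+xml" title="Posts Atom" href="https://www.example.com/feeds/posts/atom/">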
The following feed formats are available for both posts and products:
- RSS — /feeds/posts/rss/ or /feeds/products/rss/
- Atom — /feeds/posts/atom/ or /feeds/products/atom/
- XML — /feeds/posts/xml/ or /feeds/products/xml/
- JSON — /feeds/posts/json/ or /feeds/products/json/
Feeds can be filtered by category using a query string parameter, for example: /feeds/products/rss/?product_category_id=3. Each feed entry typically includes the title, a description or summary, the full URL, the publication date, and the author's name (for Atom feeds).
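For example, a single post entry in the RSS feed carries fields along these lines (values are illustrative):

<item>
  <title>Winter Sale Now On</title>
  <description>Save 20% on selected items until the end of July.</description>
  <link>https://www.example.com/posts/winter-sale/</link>
  <pubDate>Wed, 01 Jul 2026 09:00:00 +1000</pubDate>
</item>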
Google Shopping Feed
Oncord generates a dedicated Google Shopping product feed that conforms to the Google Merchant Center product data specification. This feed is designed to be submitted to Google Merchant Center for use with Google Shopping ads and free product listings. The feed is available in two formats:
- Google Shopping RSS — /feeds/products/google-rss/
- Google Shopping Atom — /feeds/products/google-atom/
Unlike standard product feeds, the Google Shopping feed includes all online products (no 25-item limit), uses the xmlns:g Google namespace, omits the publication date, and keeps descriptions concise (recommended 500–1000 characters). These feeds force a file download named products.xml, making them easy to submit to Google Merchant Center.
Each product entry includes the following Google fields: g:id, g:condition (set to "new"), g:image_link, g:availability ("in stock" or "out of stock" based on inventory), g:price (original price, e.g. "15.00 AUD"), g:sale_price (when a discount is active), g:gtin (product barcode), g:product_type (category hierarchy), and g:brand.
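Putting those fields together, a single product entry in the Google Shopping feed looks broadly like this (values are illustrative, with title, link, and description as the standard item fields alongside the g: namespace fields):

<item>
  <g:id>1024</g:id>
  <title>Stainless Steel Water Bottle</title>
  <link>https://www.example.com/products/stainless-steel-water-bottle/</link>
  <description>Keeps drinks cold for 24 hours.</description>
  <g:condition>new</g:condition>
  <g:image_link>https://www.example.com/media/bottle.jpg</g:image_link>
  <g:availability>in stock</g:availability>
  <g:price>15.00 AUD</g:price>
  <g:product_type>Homewares &gt; Drinkware</g:product_type>
  <g:brand>Example Co</g:brand>
</item>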
All feed URLs follow the pattern /feeds/{component}/{type}/. The feed type in the URL is automatically mapped to a method name — for example, google-rss maps to feedGetGoogleRss() — so any feed method following this naming convention is accessible via URL without additional configuration.
Other Special Files
Some third-party services require you to place a special file at the root of your website for verification or configuration purposes. You can upload any file to the root of your site by adding it to the media folder of your home page:
Website → Pages → Edit the Home Page → Media → Upload the special file
Any file uploaded to the home page's media folder will be publicly accessible at https://www.yourdomain.com/filename.ext. Common examples include:
- Domain verification files — services like Google Search Console, Facebook, Pinterest, and Microsoft Bing may ask you to upload an HTML file (e.g. google1234abcd.html) to prove you own the domain
- ads.txt — used by websites that display programmatic advertising to declare which advertising networks are authorised to sell ad inventory, helping to prevent ad fraud (see the example below)
- Security or policy files — such as security.txt or other service-specific configuration files
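Each line of an ads.txt file declares one authorised seller in a standard comma-separated format: the advertising system's domain, your publisher account ID, the relationship (DIRECT or RESELLER), and an optional certification authority ID. The values below are illustrative only:

google.com, pub-1234567890123456, DIRECT, f08c47fec0942fa0
exampleadnetwork.com, 98765, RESELLER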
llms.txt
The llms.txt file is an emerging proposed standard that provides structured information about a website specifically for large language models (LLMs) and AI systems. Similar to how robots.txt communicates with search engine crawlers, llms.txt aims to help AI models better understand and represent a site's content.
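Under the current draft of the proposal, llms.txt is a plain Markdown file served from the site root: a heading with the site name, a short blockquote summary, and sections of annotated links. The example below is illustrative only:

# Example Co
> Example Co sells homewares online and ships Australia-wide.

## Pages
- [About Us](https://www.example.com/about-us/): company background
- [Products](https://www.example.com/products/): full product catalogue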
This file is not presently used by any major LLM. Oncord does not currently generate or serve an llms.txt file; however, we are actively monitoring the development of this standard and will automatically implement best practices as they mature.