03a. Configuring SEO Site Meta
These SEO Site Meta settings are used to globally define the Meta for the website. When no SEO Template Meta is found for a webpage, these settings are used by default.
They are used in combination with the SEO Template Meta settings to generate JSON-LD microdata, Dublin Core metadata, Twitter Cards, Facebook OpenGraph tags, as well as HTML meta tags.
If no Template Meta exists for a template, the SEO Site Meta is used.
If any fields are left blank in a Template Meta, those fields are pulled from the SEO Site Meta.
You can also dynamically change any of these SEO Meta fields in your Twig templates, and they will appear in the rendered SEO Meta.
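As a minimal sketch of what such a dynamic override might look like in a Twig template (the `seomaticMeta` variable name, the `merge` approach, and the `entry.summary` field are assumptions here; the exact mechanism depends on your SEOmatic version, so check the templating docs for your install):

```twig
{# Hypothetical override of SEO Meta fields from an entry template.
   The seomaticMeta variable name and entry.summary field are assumptions;
   adjust them to match your SEOmatic version and your own fields. #}
{% set seomaticMeta = seomaticMeta | merge({
    seoTitle: entry.title,
    seoDescription: entry.summary,
}) %}
```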
- Site SEO Name - This field is used wherever the name of the site is referenced, both at the trailing end of the `<title>` tag, and in other meta tags on the site. It is initially set to your Craft `{{ siteName }}`.
- Site SEO Title - This should be between 10 and 70 characters (spaces included). Make sure your title tag is explicit and contains your most important keywords. Be sure that each page has a unique title tag.
- Site SEO Name Placement - Where the Site SEO Name is placed relative to the Title in the `<title>` tag
- Site SEO Name Separator - The character that should be used to separate the Site SEO Name and Title in the `<title>` tag
- Site SEO Description - This should be between 70 and 160 characters (spaces included). Meta descriptions allow you to influence how your web pages are described and displayed in search results. Ensure that all of your web pages have a unique meta description that is explicit and contains your most important keywords.
- Site SEO Keywords - Google ignores this tag, though other search engines do look at it. Use it carefully, as improper or spammy use will most likely hurt you, or may even get your site marked as spam. Avoid overstuffing the keywords, and do not include keywords that are unrelated to the specific page you place them on.
- Site SEO Image - This is the image that will be displayed as the global website brand, as well as on Twitter Cards and Facebook OpenGraph tags that link to the website. It should be an image that displays well when cropped to a square format (for Twitter).
- Site Owner - The type of entity that owns this website.
- Site Twitter Card Type - With Twitter Cards, you can attach rich photos and information to Tweets that drive traffic to your website. Users who Tweet links to your content will have a “Card” added to the Tweet that’s visible to all of their followers.
- Site Facebook Open Graph Type - Adding Open Graph tags to your website influences the performance of your links on social media by allowing you to control what appears when a user posts a link to your content on Facebook.
- Site Robots - The robots meta tag lets you utilize a granular, page-specific approach to controlling how an individual page should be indexed and served to users in search results. Setting it to a blank value means 'no change'.
- robots.txt Template - A `robots.txt` file is a file at the root of your site that indicates those parts of your site you don’t want accessed by search engine crawlers. The file uses the Robots Exclusion Standard, which is a protocol with a small set of commands that can be used to indicate access to your site by section and by specific kinds of web crawlers (such as mobile crawlers vs. desktop crawlers).

SEOmatic automatically handles requests for `/robots.txt`. For this to work, make sure that you do not have an actual `robots.txt` file in your `public/` folder (because that will take precedence).
If you are running Nginx, make sure that you don't have a line like:

`location = /robots.txt { access_log off; log_not_found off; }`

...in your config file. A directive like this will prevent SEOmatic from being able to service the request for `/robots.txt`. If you do have a line like this in your config file, just comment it out, and restart Nginx with `sudo nginx -s reload`.
The Preview Robots.txt button lets you preview what your rendered `robots.txt` file will look like.
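As a rough illustration (the rules and sitemap path below are hypothetical examples, not SEOmatic defaults), a robots.txt Template might contain something like the following, where `{{ siteUrl }}` is a standard Craft template variable that the preview renders for you:

```twig
# Hypothetical robots.txt Template contents -- tailor the rules to your own site
User-agent: *
Disallow: /admin/

# Illustrative sitemap location only; point this at your actual sitemap
Sitemap: {{ siteUrl }}sitemap.xml
```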
You can use any Craft `environmentVariables` in these fields, in addition to static text, e.g.:

`This is my {baseUrl}`
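For instance, a Site SEO Description of `Fine widgets from {baseUrl}` (a hypothetical value) would be rendered with `{baseUrl}` replaced by your site's base URL in the resulting meta tags.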