Support Center - Search Engine Optimization (SEO)
Relevant to:
  • Express - This article is relevant to Express Builder users (registered after 01/01/2014).
  • Old Interface - This article is relevant to Old Interface users (registered before 12/31/2013).

Search Engine Optimization (SEO)

The system provides a number of SEO tools that allow the site owner to quickly and easily make changes related to the website's promotion in search engines. Please note that these tools can directly affect the functionality of the website, so it is recommended that the changes be made by a professional in the field.

 

Location: To open the SEO tools, click Manage >> More Options >> SEO.

There you will see the following tools (click a tool's name to see more information about it):

 

Keywords and website description

This window allows you to define a number of settings in the HEAD area of the code - the keywords and the site description - by customizing the META tags. These settings apply to the entire website; more specific settings can be made for each individual page.
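
For reference, the tags this window controls look roughly like the following once keywords and a description are filled in (the markup the system actually generates may differ slightly; the values shown are placeholders):

  <head>
    <!-- placeholder values - replace with your own keywords and description -->
    <meta name="keywords" content="handmade jewelry, silver rings, custom necklaces">
    <meta name="description" content="A handmade jewelry studio offering custom silver and gold pieces.">
  </head>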


 

HTML code in the head

In this window you can add relevant tags and code as needed - for example, a Google Analytics code, a search engine verification tag, links to external files, and more.
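
For example, a search engine verification tag and a link to an external file pasted into this window could look like the following (the content value and the file address are placeholders - copy the real ones from the relevant service):

  <!-- verification tag issued by the search engine's webmaster tools -->
  <meta name="google-site-verification" content="PLACEHOLDER_TOKEN">
  <!-- link to an external file, for example a stylesheet hosted elsewhere -->
  <link rel="stylesheet" href="https://example.com/extra-styles.css">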


 

Sitemap for search engines

This window contains a link to the sitemap for Google Webmaster Tools. When signing up for that tool, the registration interface asks for a link to your sitemap, which you can copy from here.
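
The link you copy is simply the address of the sitemap file. For a domain such as example.com it will typically look something like the line below (the exact file name may differ in this system - copy the link exactly as it appears in the window):

  http://example.com/sitemap.xml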


 

Robots.txt file

First, please note that this tool is recommended only for site owners who are well experienced in site promotion and SEO. Working with it incorrectly can damage the website's promotion for long periods of time and can even remove the site from search engines. If you are not sure, it is best not to make any changes - a wrong change is very hard to undo.
 

 

What is the robots.txt file?

It is a file intended for standard search engines (as opposed to crawlers whose goal is to collect data for spam). Its purpose is to give them specific scanning instructions before they approach the content itself. Only search engines that follow the standard take this file into consideration, which is why it can contain specific commands that prevent scanning.

This file always sits at the top level of the domain, in a fixed location - if the domain is example.com, the file will always be found at http://example.com/robots.txt

Common uses of this tool include preventing forums from being scanned, preventing search engines from scanning images, blocking forms, and so on.
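
For example, a minimal robots.txt that tells all standard search engines not to scan an images folder could look like this (the folder path is a placeholder):

  User-agent: *
  Disallow: /images/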


 

What is the advantage of using the robots.txt file?

The clear advantage of this file is that there is no need to block content from the visitors themselves in order to influence what the search engine sees. If a page is hidden in the site, it is hidden from search engines and visitors alike, whereas a page that is blocked in the robots.txt file will not be presented to search engines but will still be visible to visitors.

In our system, there is an additional advantage to working with the file - it includes a link to the sitemap. This saves you the need to submit the sitemap to search engines, because they "learn" it from their first encounter with the website.
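
In practice this means the file contains a Sitemap line, roughly as follows (example.com stands in for your own domain; the exact sitemap address is the one shown in the Sitemap for search engines window):

  Sitemap: http://example.com/sitemap.xml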

 

Using the robots.txt file

You can work directly with the text box and type commands in it or use the command generator at the top of the page.

The command generator allows you to choose a user-agent and then give specific commands in the form of paths to the pages you would like to hide. The system includes a list of common user-agents and you can add the name of a specific one if it is not in the list. After entering the path, click Add Rule.

When you're finished making changes, click Apply and the file will be updated with the desired changes.
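
For example, choosing the Googlebot user-agent, entering a path such as /private-page (a hypothetical path used here purely for illustration) and clicking Add Rule produces something like the following in the text box:

  User-agent: Googlebot
  Disallow: /private-page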

 

Examples of paths for the robots.txt file

In practice, you can add any path to the Folders and Files box - from the domain itself down to a single page or a specific thread in a forum.

For example, to prevent a page from being scanned, copy its address from the browser's address bar, paste it into the robots.txt tool, and then choose which search engines the rule applies to.

To demonstrate the use of the tool, we have put together a number of general paths to system components that you can copy if you would like to block an entire tool from being scanned by search engines (a combined example appears after the list below):

  • In order to prevent all forums under the domain from being scanned, paste /site/detail/forum in the text field.

  • If you'd like to prevent all site forms from being scanned by search engines, use the path /site/form/ and add a rule based on it.

  • In order to prevent search engines from scanning the content in all of your albums, enter the path /site/detail/departalbum/ in the relevant field.
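
Putting the three rules above together for all standard search engines, the relevant section of the robots.txt file would look roughly like this:

  User-agent: *
  Disallow: /site/detail/forum
  Disallow: /site/form/
  Disallow: /site/detail/departalbum/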


 

301 redirections

The system allows you to create 301 redirections between pages in order to avoid broken links when transferring an existing site into the system. This explanation covers what the redirection is, how it is done, and what it means.

In essence, a 301 redirection between pages (unlike a redirection between domains) is meant for situations in which a website is moved from one location to another and the site owner wants to achieve two goals:

1. No broken links.

2. SEO efforts are not lost.

 

A 301 redirection between pages (301 is an HTTP status code that literally means "Moved Permanently") "teaches" the system's servers what to do when visitors reach an address that no longer exists in the system. Search engines know how to transfer the "authority" of the old link to the new link it is redirected to with a 301.

For example, if the "About" page of the old site was located at example.com/pages/about, and after building a new site and moving the domain the address is now example.com/9892/about, we would not want visitors to the old address to get a 404 error (which means Page Not Found).
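
For illustration, once such a redirection is in place, a request for the old address is answered with a 301 response pointing to the new address, roughly like this (the exact headers vary):

  GET /pages/about HTTP/1.1
  Host: example.com

  HTTP/1.1 301 Moved Permanently
  Location: http://example.com/9892/about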

 

How to create a 301 redirect between pages

To create a 301 redirect between pages, enter the original link in the source field and the new link in the target field. Following the previous example, you would write in the source field:

/pages/about

 

and in the target field:

/9892/about

 
 
 

From that moment on, any attempt to visit the original link will redirect the user to the target link.

 

Important

  • This tool is especially relevant to site owners who had a website in a different system and transferred their domain to ours.

  • In order for the tool to work, the domain needs to be linked to the site (meaning, the new site needs to be live). It is recommended to set up redirections in advance, before they are actually required, in order to prevent broken links.

  • Because the redirection settings have no effect before the domain is moved, the optimal time to define them is right before moving the domain.

  • A 301 redirection is the accepted form of redirection when a site is moved, so search engines know how to take it into consideration.

  • There are addresses for which you cannot define a redirection, in order to prevent interference with the system.

  • Redirections are valid for three months, a long enough time for search engines to learn about the change and update their indices.

  • This tool is also relevant for changes to page names within the same website. For example, if you renamed the "About" page to "About us", you can create a redirection between them and keep all the visitors who still look for "About". Otherwise, they will get a 404 error and go back to the homepage.

  • Please pay attention to the structure of the page paths in the system and enter the website number as part of the link.

  • For the time being, it is not possible to create redirections for pages with the extension ASPX.

  • You cannot create redirections between different domains.

  • You cannot create redirections to system pages like form pages, shopping cart, etc.


 