Detailed surveys of web users' behaviour indicate that a majority of them prefer having information in their own language. Many non-English speakers favour localized websites, i.e. pages that the server renders in their own language, taking their geographic location into account.
A multiregional site is one that specifically targets users in various regions; a site available in different languages is called a multilingual site. A few things require due care while developing websites that are multilingual, multiregional, or both, and web developers need to make some general preparations before embarking on the work.
First things first: developing a website that covers multiple regions or languages is a challenging task, because anything that goes wrong in the base version can drag you into fixing the same problem in every other version as well. So planning, and making sure everything works perfectly in the first place, must be the first box you tick. Ensure that you have the proper infrastructure to support all of this.
Second, you should take the legal and administrative implications into account while developing multiregional websites; for example, a country's laws may ban outsiders from registering its country-specific domain name. Making sure that no legal issues will arise gives you confidence that your site can be served properly to your entire target audience.
Logico studies reveal that Google differentiates between ccTLDs and gTLDs when it comes to domains. ccTLDs are country-code top-level domains, for example .pk for Pakistan, .cn for China, or .br for Brazil. Such domains are tied to a specific country, and search engines treat this as a very strong indication that the target audience is in that country. gTLDs are generic top-level domains, e.g. .com, .info, .co, or .org. Google's Webmaster guide on geographic targeting can be consulted when setting geotargets for gTLD websites.
Google generally uses the following elements to determine the geotargeting of a website (or a part of a website):
Use of a ccTLD is the strongest signal for search engines to render a website version in a specific country. The other way round, for gTLDs, Webmaster Tools' manual geotargeting can be applied at the domain, subdomain, or subdirectory level. Region tags from geotargeting, shown in search results, are also a very clear signal to users.
Server location (inferred from the IP address of the server) is frequently near the users. However, some websites use distributed content delivery networks (CDNs) or are hosted in a country with better web-server infrastructure, so search engines try not to rely on server location alone.
Server location, content delivery networks (CDNs), local addresses and phone numbers on web pages, the language used, currencies, links from other local sites, and the use of Google Maps are all hints for search engines when determining what to render where. Locational meta tags such as geo.position, and HTML attributes used for geotargeting, are not relied on much by search engines.
The first three elements used for geotargeting are strongly tied to the server and to the URLs used. It is difficult to determine geotargeting on a page-by-page basis, so it makes sense to consider a URL structure that makes it easy to segment parts of the website for geotargeting. Here are some of the possible URL structures, with their pros and cons with regard to geotargeting:
A country-specific domain (ccTLD), for example site.fr, gives the clearest geotargeting but is more expensive and may have availability restrictions. A subdomain on a gTLD, for example fr.site.com, is easy to set up and supports Webmaster Tools geotargeting, though users may not recognize the targeting from the URL alone. A subdirectory on a gTLD, for example site.com/fr/, is easy to maintain on a single host, but the separation between site versions is less obvious. URL parameters, for example site.com?loc=se or ?country=france, are not recommended, since segmenting by parameter is difficult for search engines.
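As a rough illustration of how easily each structure can be segmented (this is only a sketch, not anything search engines actually run; the domain names, the small `CCTLDS` sample, and the `locale_from_url` helper are all hypothetical):

```python
from urllib.parse import urlparse, parse_qs

# Tiny sample of country-code TLDs; a real list has roughly 250 entries.
CCTLDS = {"pk", "cn", "br", "fr", "se"}

def locale_from_url(url):
    """Guess the targeted locale from the URL structure alone."""
    parts = urlparse(url)
    labels = (parts.hostname or "").split(".")
    if labels[-1] in CCTLDS:                     # ccTLD: site.fr
        return labels[-1]
    if len(labels) > 2 and len(labels[0]) == 2:  # subdomain: fr.site.com
        return labels[0]
    segments = [s for s in parts.path.split("/") if s]
    if segments and len(segments[0]) == 2:       # subdirectory: site.com/fr/
        return segments[0]
    query = parse_qs(parts.query)                # parameter: site.com?loc=se
    if "loc" in query:
        return query["loc"][0]
    return None

print(locale_from_url("https://site.fr/page"))      # fr
print(locale_from_url("https://site.com/fr/page"))  # fr
print(locale_from_url("https://site.com?loc=se"))   # se
```

Note how the first three structures expose the locale in a stable, crawlable part of the URL, while the parameter form hides it in the query string.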
It is worth mentioning here that geotargeting is not an exact science, since sites with ccTLDs can be global, for example; so it is vital to plan for users who land on the wrong version. A simple technique to handle this is to put options on your web pages that let users pick their country and language.
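One common server-side complement to such a picker is negotiating the initial language from the browser's Accept-Language header. A minimal sketch, assuming a site localized in a hypothetical set of languages (`SUPPORTED`, `pick_language`, and the default are all illustrative):

```python
SUPPORTED = {"en", "fr", "de", "pt"}  # languages the site is localized in
DEFAULT = "en"

def pick_language(accept_language):
    """Return the supported language the user most prefers."""
    candidates = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        lang, _, q = piece.partition(";q=")
        try:
            weight = float(q) if q else 1.0  # no q-value means weight 1.0
        except ValueError:
            weight = 0.0
        # Keep only the primary subtag: "pt-BR" -> "pt"
        candidates.append((weight, lang.split("-")[0].lower()))
    for _, lang in sorted(candidates, reverse=True):
        if lang in SUPPORTED:
            return lang
    return DEFAULT

print(pick_language("pt-BR,pt;q=0.9,en;q=0.8"))  # -> pt
```

Pairing this kind of guess with a visible country/language picker keeps users in control when the guess is wrong.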
Dealing with duplicate content on global websites
Sometimes these multilingual and multiregional websites offer the same content at different URLs, which is not a bad thing if the different copies are intended for different people in different countries. Still, Logico's web development and data analysis teams recommend producing unique content for each distinct set of users, though this is not always possible. Disallowing crawlers via the robots.txt file or setting the robots content value to 'noindex' is not a good practice to follow.
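For reference, this is the kind of crawler-blocking rule the recommendation above advises against for duplicate regional content (the path `/fr/` is a hypothetical regional subdirectory; the fragment is shown only to illustrate what to avoid):

```
# robots.txt — blocking a regional section from crawling (discouraged here)
User-agent: *
Disallow: /fr/
```

The per-page equivalent, also discouraged here, is a robots meta tag with its content value set to 'noindex'.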