1. Use Unique Keywords for Your Website
Google's keyword tool used to be a favorite keyword research tool among digital marketers and SEOs. But in recent years, Google has reduced the amount of keyword data available in Keyword Planner and Google Analytics. A lot of people brushed off keyword research and started saying things like "just create great content" or "write content for people instead of the search engines." While creating high-quality content is important, ignoring keyword research and SEO is a mistake. The fact is that even though less data is available through Google's tools, people still use keywords to find what they're looking for. Finding out what keywords people are using is as important as ever if you want search engine traffic.
Mention keywords where they matter most. Include your main keywords in your site title, meta description, tagline, and page titles.
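As a sketch, a page's main keyword might be placed in the HTML head like this (the store name and keyword phrase below are placeholders, not recommendations for any real site):

```html
<head>
  <!-- Main keyword ("handmade leather bags") in the title, near the front -->
  <title>Handmade Leather Bags | Example Store</title>
  <!-- Keyword repeated naturally in the meta description -->
  <meta name="description"
        content="Shop handmade leather bags crafted in small batches. Free shipping on all handmade leather goods.">
</head>
```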
2. Unique Content
The value of unique content comes from the danger of duplicate content. Duplicate content may serve your users just as well as unique content, but it doesn't help search engines. Search engines want to find a single version of a piece of content and send users there. To determine the best place to send users, they look at where the content was first published and which of the sites carrying it has the most and best links. If your website has lots of duplicate content, Google may view it as "low value" and stop ranking even your unique pages.
Be sure your content is unique, well written, and focused on your primary keywords.
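When some duplication is unavoidable (print versions, tracking parameters, syndicated copies), a canonical tag tells search engines which version to treat as the original. A minimal sketch, with a placeholder URL:

```html
<!-- On the duplicate page, point search engines at the preferred URL -->
<link rel="canonical" href="https://example.com/original-article/">
```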
3. Don’t use Capital Letters
Capitalization in URLs is a common source of trouble. On many web servers the path portion of a URL is case-sensitive, so /About and /about can be treated as two different pages, which confuses visitors and splits your links between duplicate URLs.
Don't use capital letters in URLs. Mixed-case URLs create confusion and are harder to remember and retype, so stick to lowercase letters in your URLs.
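On an Apache server, one way to enforce lowercase URLs is a rewrite rule using the built-in tolower map. Note this is a sketch, not a drop-in config: the RewriteMap directive must go in the server or virtual-host configuration, not in .htaccess.

```apache
# Server or virtual-host configuration
RewriteEngine On
RewriteMap lc int:tolower

# 301-redirect any URL containing uppercase letters to its lowercase form
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```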
4. Use Robots.txt to Block Bad URLs
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
If some of your URLs serve duplicate or thin content, that content can drag down how search engines evaluate the site as a whole. Block those URLs with robots.txt so that crawlers focus on the quality content at the correct URLs linked across your website.
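A robots.txt file lives at the root of the site. As an illustrative sketch (the blocked paths are placeholders for whatever duplicate sections your own site has):

```text
# robots.txt — served from https://example.com/robots.txt
User-agent: *
# Block duplicate or low-value sections (example paths)
Disallow: /print/
Disallow: /search/
Disallow: /tag/

# Everything else remains crawlable
Allow: /
```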
5. Use 301 Redirects
There is a lot of confusion about what to do when you delete or move a page and want to redirect incoming web traffic from the "old" web page to a new one. There are many ways to set up a page redirect, and it is worth learning how to configure one for your website. A 301 redirect is the most effective method for permanently redirecting visitors and search engines from an old web page to a new location.
If you have changed or removed the URL of a page that Google has been ranking for a long time, visitors and crawlers will no longer find it and your rankings will drop. Use a 301 redirect from the old URL to the new one so that Google can recognize the new URL and carry the page's rankings over to it.
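On an Apache server, a 301 redirect can be added to the site's .htaccess file. A minimal sketch, where the old and new paths shown are placeholders:

```apache
# .htaccess — permanently redirect an old page to its new location
Redirect 301 /old-page.html /new-page.html

# Or, with mod_rewrite, redirect a whole renamed directory
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```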