In order to be properly indexed by Google (and other search engines), we need to set three basic values in the page header: title, description and keywords. So in the header layout we should include them as a title tag and two meta tags:
```haml
%title= content_for?(:title) ? yield(:title) : "Lebrijo.com - Internet applications Consultancy"
%meta{content: content_for?(:description) ? yield(:description) : "We build high quality Internet application for our customers", name: "description"}
%meta{content: content_for?(:keywords) ? yield(:keywords) : "ruby, ruby on rails, consultancy, webapps, developers", name: "keywords"}
```
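Individual pages can then override these defaults with `content_for` in their views. A minimal sketch, assuming a hypothetical about page (the view path and values are just illustrative):

```haml
-# app/views/pages/about.html.haml (hypothetical view, illustrative values)
- content_for :title, "About us - Lebrijo.com"
- content_for :description, "Who we are and how we build Internet applications"
- content_for :keywords, "ruby on rails, consultancy, about"
```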
Another useful thing is creating a /sitemap.xml file so search-engine bots can see your app's structure. We can use the sitemap_generator gem; just include it in the Gemfile:
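```ruby
# Gemfile
gem 'sitemap_generator'
```

And run `bundle install`.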
I have created a rake task to generate it automatically:
desc "Generates sitemap" namespace :sitemap do task :generate do SitemapGenerator::Sitemap.default_host = 'http://www.lebrijo.com' SitemapGenerator::Sitemap.create do add '/', changefreq: 'daily', priority: 0.9 add '/', changefreq: 'daily', host: 'http://blog.lebrijo.com', priority: 0.8 add '/', changefreq: 'weekly', host: 'http://jenkins.lebrijo.com' add '/about', changefreq: 'weekly' add '/contact', changefreq: 'weekly' end SitemapGenerator::Sitemap.ping_search_engines end end |
And a Capistrano task that regenerates the sitemap on every deployment:
```ruby
namespace :sitemap do
  desc "Generate sitemap.xml"
  task :generate do
    on roles(:app) do
      within release_path do
        with rails_env: fetch(:stage) do
          execute :rake, 'sitemap:generate'
        end
      end
    end
  end

  after "deploy:published", "sitemap:generate"
end
```
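This file can live with the rest of your custom Capistrano tasks (for example under `lib/capistrano/tasks/`, which the default Capfile already imports), so the `sitemap:generate` task runs right after `deploy:published` on every deployment.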
Finally, be careful with your robots.txt configuration: Google must be allowed to download and crawl your pages.
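A permissive robots.txt that also advertises the sitemap could look like this (a sketch using my own host; adjust the URL to yours):

```
# public/robots.txt
User-agent: *
Disallow:

Sitemap: http://www.lebrijo.com/sitemap.xml.gz
```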
Once everything is done and deployed, add your application to Google Webmaster Tools. Don’t forget to submit your sitemap.xml.gz URL.
Here you have a great link listing other things you can do to improve your website's SEO.