I recently installed an SSL cert on one of my sites, and I have noticed that Google has now indexed both the HTTP and HTTPS version of each page. I haven't noticed any ranking problems so far, but I am conscious that a duplicate content issue may arise.
How can I overcome this? Only a few of my pages need to use HTTPS; most of the pages on the site are best served over plain HTTP. In fact, I could get away with not using HTTPS pages at all for the time being if necessary.
A few ideas I have come across are: 301 redirects, i.e. redirecting all HTTPS requests to HTTP with .htaccess, roughly along the lines of the sketch below.
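Something like this is what I had in mind (just a rough sketch for an Apache site with mod_rewrite enabled; example.com stands in for my actual domain):

    RewriteEngine On
    # If the request arrived over SSL, send a permanent redirect to the plain HTTP version
    RewriteCond %{HTTPS} on
    RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]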
Robots.txt for the SSL pages, again using .htaccess, along the lines of the second sketch below. The problem here is that the HTTPS pages have already been indexed, and I would like them to be deindexed. I am not sure robots.txt would be sufficient, because as far as I am aware it only tells a bot not to crawl a page, and these pages have already been indexed.
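If I went down that route, I assume it would look roughly like this (robots_ssl.txt is just a name I've made up for a second robots file that blocks everything):

    RewriteEngine On
    # Serve a separate robots file to requests that come in over HTTPS
    RewriteCond %{HTTPS} on
    RewriteRule ^robots\.txt$ robots_ssl.txt [L]

with robots_ssl.txt containing:

    User-agent: *
    Disallow: /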
Are there any other suggestions?