There is a website that was created using ColdFusion (not sure if that matters). I need to interact with this website. The main things I need to do are navigate to different pages and click buttons.
I have come up with two ideas for how to do this. The first is to use the WebBrowser control. With it, I could certainly navigate pages and click buttons (According to This).
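To make that concrete, here is roughly what I have in mind with the WebBrowser control; the URL and the element ID below are just placeholders:

```csharp
// Rough sketch of driving a page with the WinForms WebBrowser control.
// The URL and the element ID ("submitButton") are placeholders, not real values.
using System.Windows.Forms;

public class WebBrowserSketch
{
    public static void Run()
    {
        var browser = new WebBrowser();
        browser.DocumentCompleted += (sender, e) =>
        {
            // Once the page has loaded, find a button by its ID and click it.
            HtmlElement button = browser.Document.GetElementById("submitButton");
            button?.InvokeMember("click");
        };
        browser.Navigate("https://example.com/somePage.cfm");
        // A WinForms message loop (Application.Run) is needed for the
        // DocumentCompleted event to actually fire.
    }
}
```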
The other way is to interact with the HTML directly. I'm not sure exactly how to do this, but I assume I could simulate the button clicks or send HTTP requests to interact with the page.
Does anyone have a recommendation on which way is better? Is there a better way that I haven't thought of?
Why not submit directly to the URL? That's what the button click will do. Using WebRequest.Create, you can post directly to the URL; there's no need to load, parse, and "click" the button.
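Something along these lines, for example; the URL and the form fields below are made up, so substitute whatever the real form actually posts:

```csharp
// Sketch: posting the form data directly, skipping the browser entirely.
// The URL and the field names are hypothetical placeholders.
using System;
using System.IO;
using System.Net;
using System.Text;

class DirectPost
{
    static void Main()
    {
        byte[] body = Encoding.UTF8.GetBytes("field1=value1&field2=value2");

        var request = (HttpWebRequest)WebRequest.Create("https://example.com/page.cfm");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = body.Length;

        using (Stream stream = request.GetRequestStream())
        {
            stream.Write(body, 0, body.Length);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // This is the page the "click" would have produced.
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```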
Have you considered Selenium? The WebDriver API is quite good and allows a lot in terms of website automation.
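For instance, a minimal sketch with the Selenium .NET bindings; the URL and element ID are placeholders, and it assumes the Selenium.WebDriver and ChromeDriver packages are installed:

```csharp
// Sketch: navigating and clicking a button with Selenium WebDriver (.NET bindings).
// The URL and the element ID are placeholders.
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class SeleniumSketch
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl("https://example.com/somePage.cfm");

            // Find the button by its ID and click it, just like a user would.
            IWebElement button = driver.FindElement(By.Id("submitButton"));
            button.Click();
        }
    }
}
```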
I'd use Html Agility Pack to parse the HTML and then do the appropriate GETs and POSTs with HttpWebRequest.
While it may be possible to use the WebBrowser control to simulate clicks and navigation, you get more control over what actually gets sent with Html Agility Pack and HttpWebRequest.
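Roughly like this; the URL, the form's id, and the field names are assumptions about the target page, so adjust them to whatever the form really contains:

```csharp
// Sketch: GET the page, parse the form with Html Agility Pack,
// then replay the submission as a POST with HttpWebRequest.
// The URL, the form selector, and the field names are assumed for illustration.
using System;
using System.IO;
using System.Net;
using System.Text;
using HtmlAgilityPack;

class ScrapeAndSubmit
{
    static void Main()
    {
        // 1. GET the page that contains the form/button.
        var web = new HtmlWeb();
        HtmlDocument doc = web.Load("https://example.com/somePage.cfm");

        // 2. Find the form and read where it posts to.
        HtmlNode form = doc.DocumentNode.SelectSingleNode("//form[@id='searchForm']");
        string action = form.GetAttributeValue("action", "somePage.cfm");

        // 3. POST the same fields the button click would have submitted.
        byte[] body = Encoding.UTF8.GetBytes("query=test&submit=Go");
        var request = (HttpWebRequest)WebRequest.Create(
            new Uri(new Uri("https://example.com/"), action));
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = body.Length;
        using (Stream s = request.GetRequestStream())
            s.Write(body, 0, body.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}
```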
HtmlAgilityPack is useful for pulling web elements and finding tags easily. If you need to remotely "steer" a web session, though, I prefer WatiN. It bills itself as a web unit-testing framework, but it's very useful any time you need to fake a browser session. Further, it can remote-control different browsers well enough for most tasks you'll need (like finding a button and pushing it, or finding a text field and filling in text if you need to log in).
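As a sketch of the login case (the URL and the element names are invented for the example; WatiN's IE automation needs an STA thread):

```csharp
// Sketch: steering a real browser session with WatiN.
// The URL and the element names ("username", "password", "loginButton")
// are placeholders for whatever the real page uses.
using WatiN.Core;

class WatiNSketch
{
    [System.STAThread]
    static void Main()
    {
        using (var browser = new IE("https://example.com/login.cfm"))
        {
            // Fill in the login form and push the button, like a user would.
            browser.TextField(Find.ByName("username")).TypeText("myUser");
            browser.TextField(Find.ByName("password")).TypeText("myPassword");
            browser.Button(Find.ById("loginButton")).Click();
        }
    }
}
```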