According to capybara's README:
The following two statements are functionally equivalent:

page.should_not have_xpath('a')
page.should have_no_xpath('a')
However, when trying this out, that does not appear to be true. This seems to work fine when using capybara-webkit:
visit dashboard_accounting_reports_path
click_link 'Delete'
page.should_not have_css('#reports .report')
But when using poltergeist, it often fails with the error below, which suggests it's calling #has_css?
under the covers; that method returns the current state once and won't wait for the given element to disappear:
Failure/Error: page.should_not have_css('#reports .report')
expected #has_css?("#reports .report") to return false, got true
If I change the assertion to read like this instead, it seems to succeed every time:
page.should have_no_css('#reports .report')
Am I crazy, or is this a bug in poltergeist? I'm using PhantomJS 1.8.2, poltergeist 1.1.0, and capybara 2.0.2.
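For what it's worth, here is a minimal, hypothetical sketch of why the two forms could behave differently under async DOM updates. This is not Capybara's or poltergeist's actual source; FakePage and its polling budget are invented for illustration. The idea: negating a single #has_css? call races against the element's removal, while a #has_no_css?-style check can retry until the element is gone.

```ruby
# Hypothetical illustration (not real Capybara code): a page where an
# element disappears only after a few polls, simulating an async delete.
class FakePage
  def initialize(polls_until_gone)
    @polls_until_gone = polls_until_gone
  end

  # Like a non-waiting #has_css?: reports the CURRENT state once.
  def has_css?
    @polls_until_gone -= 1
    @polls_until_gone >= 0 # still present for the first few polls
  end

  # Like #has_no_css?: retries until the element is gone or the
  # retry budget runs out, so it tolerates the async removal.
  def has_no_css?(retries: 10)
    retries.times do
      return true unless has_css?
    end
    false
  end
end

# `should_not have_css` ~ one negated #has_css? call: races and fails.
racy = FakePage.new(3)
puts !racy.has_css?      # => false (element hasn't disappeared yet)

# `should have_no_css` ~ #has_no_css?: polls until the element vanishes.
waiting = FakePage.new(3)
puts waiting.has_no_css? # => true
```

If that mental model is right, it would explain why the have_no_css form passes every time while the negated form is flaky.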
Here's the debugging output from the should_not have_css example: http://pastebin.com/4ZtPEN5B
And here's the output from the should have_no_css example: http://pastebin.com/TrtURWcZ