Is there a way I can specify a web site URL and get its data into a CSV file (or data frame) for analysis in R?
Base read.csv works just fine without the url() wrapper; you can pass the URL string to it directly. Probably I am missing something if Dirk Eddelbuettel included url() in his answer. Another option is to use one of two popular packages: data.table (via fread) or readr (via read_csv).
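A minimal sketch of all three approaches. The URL below is a hypothetical placeholder; any publicly reachable CSV URL works the same way:

```r
# All three readers accept a URL string directly.
# Hypothetical URL, used only for illustration.
u <- "http://example.com/data/foo.csv"

dat1 <- read.csv(u)     # base R; no url() wrapper needed

library(data.table)
dat2 <- fread(u)        # data.table: fast, returns a data.table

library(readr)
dat3 <- read_csv(u)     # readr: returns a tibble
```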
Besides read.csv(url("...")), you can also use read.table("http://..."). Example:
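For instance (hypothetical URL; note that read.table defaults to header = FALSE and whitespace separators, so set both explicitly for a CSV-style file):

```r
# Hypothetical URL for illustration
dat <- read.table("http://example.com/data/foo.csv",
                  header = TRUE, sep = ",")
```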
scan can read from a web page directly; you don't necessarily have to mess with connections.

Short answer: yes. In the simplest case, just do read.csv(url("http://...")), plus whichever options read.csv() may need. Long answer: yes, this can be done, and many packages have used that feature for years. E.g. the tseries package has used exactly this feature to download stock prices from Yahoo! for almost a decade.
This is all exceedingly well documented in the manual pages for help(connection) and help(url). Also see the manual on 'Data Import/Export' that came with R.

Often data on web pages is in the form of an HTML/XML table. You can read such a table into R using the XML package.
In this package, the function readHTMLTable will look through a page for tables and return a list of data frames (one for each table found).
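A minimal sketch, assuming a hypothetical page with at least one HTML table (for https pages you may need to fetch the HTML first, e.g. with RCurl or httr, since readHTMLTable does not handle https URLs directly):

```r
library(XML)

# Hypothetical page containing one or more HTML tables
tables <- readHTMLTable("http://example.com/stats.html",
                        stringsAsFactors = FALSE)

length(tables)     # how many tables were found on the page
head(tables[[1]])  # first table, already parsed into a data frame
```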