Hello! I am working on a project and I need some information from a certain Wiki website.
What I need is the same section from each of over 500 wiki pages, automatically copied into a spreadsheet.
By hand it takes me about 20 seconds per page x 500 pages = 10,000 seconds, which is roughly 167 minutes, or close to 3 hours (loosely counted).
Is there any program which can do this? Or any other tricks?
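For what it's worth, a job like this can be sketched in Python with only the standard library. Everything below is a made-up placeholder (the sample HTML, the section id "Summary", the output filename): it only shows the shape of an extract-and-write loop, with the actual page fetching stubbed out.

```python
import csv
import re

def extract_section(html, section_id):
    """Pull the text between a heading with the given id and the next heading.
    Very rough HTML handling; a real scraper should use an HTML parser."""
    pattern = re.compile(
        r'id="%s".*?</h\d>(.*?)(?:<h\d|\Z)' % re.escape(section_id),
        re.S,
    )
    m = pattern.search(html)
    if not m:
        return ""
    # Strip the remaining tags and collapse whitespace.
    text = re.sub(r"<[^>]+>", " ", m.group(1))
    return " ".join(text.split())

# Placeholder data standing in for the ~500 fetched wiki pages.
pages = {
    "Page_A": '<h2 id="Summary">Summary</h2><p>First summary.</p><h2 id="Next">',
    "Page_B": '<h2 id="Summary">Summary</h2><p>Second summary.</p>',
}

rows = [(name, extract_section(html, "Summary"))
        for name, html in sorted(pages.items())]

# Write one row per page into a CSV file, which opens in any spreadsheet.
with open("sections.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["page", "section_text"])
    writer.writerows(rows)
```

With real pages the loop would fetch each URL first (urllib in the standard library), but the extract-then-append-a-row structure stays the same.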
Thanks!
netsplitter
Member
From: Melbourne
Registered: 2008-07-13
Posts: 183
Is that all the output you get? I'm not too familiar with Macs, but don't you need to do that as a privileged user (with sudo)?
Mesqueeb wrote:
I have python 3 and 2.7 both installed (don't know why) maybe that is jamming it?
It's possible. It looks like the "python" command is using Python 2.7 (since that's where it was installed), so it should work. Check anyway:
When you first enter "python", the very first line at the top will tell you which version it's running.
That should hopefully be 2.7.x
Blahah
Member
From: Cambridge, UK
Registered: 2008-07-15
Posts: 715
Personally I use ActiveState Python, and I have the 2.7 distribution installed.
Then, to install beautifulsoup, you type:
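Assuming ActivePython's bundled package manager, pypm, the install command would be something like this (the exact command is an assumption; on a stock Python install, pip does the same job):

```shell
# ActivePython's bundled package manager (assumed command form):
pypm install beautifulsoup

# Equivalent on a stock Python install with pip available:
pip install BeautifulSoup
```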
Then when you load a python shell, you type:
Note that this is case-sensitive, so it has to be BeautifulSoup, not beautifulsoup.
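Assuming BeautifulSoup 3 (the release contemporary with Python 2.7), the import looks like the first line below; the fallback to bs4 covers the newer packaging and is only there so the snippet runs either way:

```python
# The module name is case-sensitive: "BeautifulSoup", not "beautifulsoup".
try:
    from BeautifulSoup import BeautifulSoup  # BeautifulSoup 3.x
except ImportError:
    try:
        from bs4 import BeautifulSoup  # BeautifulSoup 4.x (newer packaging)
    except ImportError:
        BeautifulSoup = None  # not installed in this environment

print("BeautifulSoup importable:", BeautifulSoup is not None)
```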