This week we have been working on another project called the “Cereal Project.” It’s basically the same thing as the Django States project: you take a CSV file, upload it to your database, and then pull information from it based on what you want to see on your website. I can say I’m not having as much anxiety about databases as I was two weeks ago, but I still feel like I need training wheels (aka T.A. help).
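The CSV-to-database step might look something like this. The post doesn’t show the actual cereal data or model, so the column names and the `Cereal` model mentioned in the comments are assumptions; this sketch just shows the parsing half with the standard library.

```python
import csv
import io

# Hypothetical sample matching the Cereal Project's CSV -- the real
# column names aren't given in the post, so these are guesses.
SAMPLE_CSV = """name,mfr,calories
100% Bran,N,70
Cheerios,G,110
"""

def load_cereals(csv_text):
    """Parse CSV text into a list of dicts, one per cereal row.

    In the real Django project each dict would become a model
    instance, e.g. Cereal.objects.create(**row).
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = []
    for row in reader:
        row["calories"] = int(row["calories"])  # cast numeric fields
        rows.append(row)
    return rows

cereals = load_cereals(SAMPLE_CSV)
```

From there it’s the same pattern as the States project: loop over the rows and hand each one to the ORM.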
This week we have spent time on deployment, requests, and scraping. Here is a little recap for you:
1) Deployment: Unfortunately, everything you may or may not have heard about learning deployment is true. It sucks!!! My experience may have been a little worse because the hosting company we were using was having a freakout about all of us using the same IP addresses and trying to connect to their servers at the same time. In the end it all worked out and we were able to proceed with deployment. I’m about 98% of the way deployed now. I can use my domain name and see the website I’ve created, but only by manually typing in certain views. So hopefully in the next day or two I can get it fully deployed.
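For what it’s worth, “the site only works if I type in certain views” is often a sign that the root URL simply has no pattern mapped to it. This is a guess at the missing piece, written in the Django 1.8-era style that was current in 2015; the app name and view names are made up for illustration.

```python
# urls.py -- without a pattern for the empty path (''), visiting the
# bare domain 404s even though the individual views work fine.
from django.conf.urls import url
from myapp import views  # "myapp" and the view names are assumptions

urlpatterns = [
    url(r'^$', views.home, name='home'),  # root of the domain
    url(r'^cereals/$', views.cereal_list, name='cereal_list'),
]
```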
2) Requests: That is what we worked on today. I thought it was pretty cool. I have a few app ideas that would involve using an API so it was something I was interested in learning. We used a music API that we could get data from and then we imported it to our database and proceeded with building a web application. A lot of what we did was a review, but utilizing the API was new.
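The post doesn’t name the music API, so the payload shape below is an assumption; in practice the data would come from something like `requests.get(url).json()`. This sketch separates the parsing step, which is the part that feeds the database.

```python
import json

# A stand-in for the API's JSON response -- the real music API isn't
# named in the post, so this shape is a guess.
SAMPLE_RESPONSE = json.dumps({
    "artists": [
        {"name": "Radiohead", "listeners": 4000000},
        {"name": "Björk", "listeners": 2000000},
    ]
})

def parse_artists(raw_json):
    """Turn the API's JSON text into rows ready for the database.

    In the Django app each tuple would feed a model, e.g.
    Artist.objects.create(name=..., listeners=...).
    """
    data = json.loads(raw_json)
    return [(a["name"], a["listeners"]) for a in data["artists"]]

rows = parse_artists(SAMPLE_RESPONSE)
```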
3) Scraping: urllib2 and lxml. In short, scraping is where you go to a website and programmatically “scrape” data off of the webpage. It’s an involved process where you find the XPath for the data you want and then pull it into the webpage you are building. I wasn’t super involved in this lesson because I was behind on my other projects. It is definitely something I will need to go back and review when I can.
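For future review, the basic idea can be sketched without any network access. In class the page came from `urllib2.urlopen(url).read()` and was parsed with lxml; this version uses the standard library’s ElementTree (which supports only a subset of XPath) on a made-up page, so the HTML and the class names are assumptions.

```python
import xml.etree.ElementTree as ET

# A stand-in for a fetched page; in class this came from
# urllib2.urlopen(url).read() and was parsed with lxml.html.
PAGE = """
<html>
  <body>
    <table id="cereals">
      <tr><td class="name">Cheerios</td></tr>
      <tr><td class="name">Corn Flakes</td></tr>
    </table>
  </body>
</html>
"""

def scrape_names(html_text):
    """Extract cereal names with an XPath-style expression.

    lxml's fuller equivalent would be
    tree.xpath('//td[@class="name"]/text()').
    """
    tree = ET.fromstring(html_text)
    return [td.text for td in tree.findall('.//td[@class="name"]')]

names = scrape_names(PAGE)
```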
That is all for now. More to come tomorrow 🙂
( Oct 15th, 2015 )