Over the last few weeks I have been working on a web app that needs the ability to query hundreds, thousands or even millions of pages. The biggest run I've tested in one go so far was nearly 250,000 pages, which took about an hour and a half to complete - and that was before I researched Python threading!
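The post's actual crawler isn't shown here, but the threaded approach can be sketched with the standard library's `concurrent.futures`. This is a minimal sketch under my own assumptions (function names, worker count, and the injectable `fetch` parameter are all mine, added so the logic can be exercised without a network):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch_status(url):
    """Return (url, HTTP status or error) for a single page."""
    try:
        with urlopen(url, timeout=10) as resp:
            return url, resp.status
    except Exception as exc:
        return url, str(exc)

def crawl(urls, fetch=fetch_status, workers=20):
    """Check many pages concurrently instead of one at a time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch, urls))
```

Because the pages are fetched in parallel rather than one after another, the wall-clock time for a large batch drops dramatically - which is the whole point of reaching for threads here.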
Here's a quick script for dealing with merged cells in a CSV export. Basically, if you need to unmerge a bunch of cells and duplicate the original value into the split cells, here's an easy way to do it with Python.
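When a spreadsheet with merged cells is saved as CSV, only the first cell of each merge keeps the value and the rest come out blank. A stdlib-only sketch of the fill-down step might look like this (the function name and row-list interface are my own assumptions, not the post's actual script):

```python
def fill_down(rows):
    """Copy the last non-empty value in each column into the blank cells below it."""
    last = {}          # column index -> last non-empty value seen
    filled = []
    for row in rows:
        new_row = []
        for i, cell in enumerate(row):
            if cell == "":
                cell = last.get(i, "")   # blank: reuse the value from the merge's first cell
            else:
                last[i] = cell           # non-blank: remember it for the rows below
            new_row.append(cell)
        filled.append(new_row)
    return filled
```

Feed it the rows from `csv.reader` and write the result back out with `csv.writer`, and every formerly-merged cell carries its value.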
I've just had a client's site go offline due to a server change. It was a quick move, so no big deal, but I wanted to know when the site came back online without having to keep checking on it manually - enter a quick Python script.
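The idea is just a polling loop: hit the URL, and if it isn't answering yet, sleep and try again. A minimal sketch, assuming my own function names and an injectable `check` so the loop can be tested without a live site:

```python
import time
from urllib.request import urlopen

def is_up(url):
    """Return True if the URL answers with HTTP 200."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except Exception:
        return False

def wait_until_up(url, check=is_up, interval=60, max_checks=1000):
    """Poll until the site responds; return how many checks it took, or None."""
    for attempt in range(1, max_checks + 1):
        if check(url):
            return attempt
        time.sleep(interval)
    return None
```

Kick it off in a terminal and get on with something else; it reports back once the site responds.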
Python classes have been difficult to learn: the tutorials for them are either too in-depth and complicated, or so simplistic that they don't give you a good feel for how you might actually use one. I'm going to try and hit the sweet spot, though it should be noted that I am still very much a noob when it comes to Python classes.
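For a middle-ground example (my own, not from the post), here's a class doing a small practical job - bundling a crawled page's data together with behaviour that belongs to it:

```python
class Page:
    """A crawled page: data (url, status) plus behaviour in one place."""

    def __init__(self, url, status_code):
        self.url = url
        self.status_code = status_code

    def is_ok(self):
        """True when the page returned a healthy response."""
        return self.status_code == 200

    def __repr__(self):
        return f"Page({self.url!r}, {self.status_code})"

# Using the class: a list of Page objects is tidier than parallel lists of
# URLs and status codes, and each object knows how to answer questions about itself.
pages = [Page("/home", 200), Page("/old", 404)]
broken = [p.url for p in pages if not p.is_ok()]
```

The win over a plain dict or tuple is that the behaviour (`is_ok`) travels with the data, so calling code doesn't need to remember what status code means "fine".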
As the title suggests, I have been quietly building a web app in my spare time. There is more to do yet, but it is arguably already a useful tool, and with a bit of polish I could probably launch it. Without giving too much away, it basically diagnoses indexing issues within Google. That's it.
I wanted to keep track of a ranking position today, so I repurposed some existing code and came up with something similar to the below. I have changed it to be more general; really the only difference is that my version only printed to the screen if the ranking domain was the one I was interested in (it was a new page and I was testing how long it would take to rank, if at all), and it checked the same query every 10 minutes or so.
Today I was tasked with renaming over a thousand files, off the back of a recommendation to the client to rename them to be more specific to what they are - essentially changing each file from its product code to its product name. This is fairly standard practice and it's something worth doing, especially if your product is fairly visual, as ranking an image is a piece of cake.
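The bulk rename boils down to a code-to-name mapping plus `os.rename`. A minimal sketch, assuming a dict mapping product codes to names (the function name and return value are my own, for illustration):

```python
import os

def rename_products(folder, code_to_name):
    """Rename files from product codes to product names, keeping extensions."""
    renamed = []
    for filename in os.listdir(folder):
        stem, ext = os.path.splitext(filename)
        if stem in code_to_name:
            new_name = code_to_name[stem] + ext
            os.rename(os.path.join(folder, filename),
                      os.path.join(folder, new_name))
            renamed.append((filename, new_name))
    return renamed
```

In practice the mapping would come from the client's product export (e.g. a two-column CSV of code and name), so a thousand files rename in seconds.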
So, as the title suggests and as you may or may not have noticed, I have a new design and a whole new backend. After getting more and more fed up with running a WordPress-powered site, I thought it was time to update things and use my own system - which a dev at work has dubbed 'CraigPress'.
When I first saw a list comprehension I simply shook my head in utter confusion, but the fact is they are actually fairly straightforward. Sure, you can make them more and more complicated, but here are some basic examples to get you started...
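The kind of basic examples meant here look like this - each comprehension is just a for loop squeezed onto one line (the sample data is mine):

```python
numbers = [1, 2, 3, 4, 5]

# The long way: an ordinary loop building a list.
squares_loop = []
for n in numbers:
    squares_loop.append(n * n)

# The same thing as a comprehension.
squares = [n * n for n in numbers]

# Filtering with an if clause - keep only even numbers.
evens = [n for n in numbers if n % 2 == 0]

# Transforming strings, e.g. prefixing a list of domains.
urls = ["example.com/a", "example.com/b"]
full_urls = ["http://" + u for u in urls]
```

Once the `[expression for item in iterable if condition]` shape clicks, they read left to right like a sentence.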
So I've done a little web scraping, mainly for myself and a little for work. A few nights ago my wife was telling me about a task she had to do for one of her colleagues: fetching data from a website full of contractor information from within her industry. Intrigued, I took a look and thought I could probably automate this :)
I wrote the following script to monitor my client sites' uptime - essentially, if a site goes down for whatever reason, I will be notified via email. This doesn't include sites hosted by ourselves (Bronco), as they are monitored already; this is for sites where we only do consulting and the hosting is handled by others.
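The two halves are a status check and an email alert, both doable with the standard library. A sketch under my own assumptions (function names, the placeholder SMTP host and address, and the injectable `check` are all mine, not the post's script):

```python
import smtplib
from email.message import EmailMessage
from urllib.request import urlopen

def is_up(url):
    """True if the URL answers with HTTP 200."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except Exception:
        return False

def find_down(urls, check=is_up):
    """Return the subset of URLs that fail the check."""
    return [u for u in urls if not check(u)]

def alert(down_urls, smtp_host="localhost", to="me@example.com"):
    """Email a list of down sites. Host and address are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = f"{len(down_urls)} site(s) down"
    msg["From"] = to
    msg["To"] = to
    msg.set_content("\n".join(down_urls))
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```

Run `find_down` on a schedule (cron or a sleep loop) and call `alert` only when the list is non-empty.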
I have had to parse a number of XML sitemaps this week for different reasons, so I thought I would make it a little easier and quicker. There are standard libraries specifically for parsing XML, but this is what I came up with...
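For comparison, here's how it looks with the standard library's `xml.etree.ElementTree` - a minimal sketch (the function name is mine) that pulls every `<loc>` URL out of a sitemap, remembering that the sitemap protocol puts elements in a namespace:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Pull every <loc> URL out of a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]
```

The namespace prefix is the usual gotcha: without it, `iter("loc")` silently finds nothing.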
There are obviously dozens of reasons to want to see Twitter and Facebook share counts, so I have written a surprisingly simple script to do the job for me - I thought this would run to 50+ lines of code at least. This version only checks a single URL, but with a slight modification I could feed in multiple URLs from competitors, a news or blog section, or an entire site.
I wrote this script as I was getting pissed off with Excel crashing when checking a measly 100,000 links; working with formulas, specifically VLOOKUP, can be a real nightmare!
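The Python equivalent of VLOOKUP is a dict: build the lookup table once, then every lookup is effectively instant, with no recalculating spreadsheet. A sketch with my own function name and column conventions (key in the first column, value in the second):

```python
def vlookup(keys, table_rows, key_col=0, value_col=1, default="#N/A"):
    """Replace Excel VLOOKUP: build the table as a dict, then look up each key."""
    table = {row[key_col]: row[value_col] for row in table_rows}
    return [table.get(k, default) for k in keys]
```

Read both files with `csv.reader`, pass the rows in, and 100,000 links resolve in well under a second instead of locking Excel up.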
When working with large datasets I tend to use Python, as it's a lot faster than Excel for file manipulations and doesn't crash on large inputs.
I've just been trying to check for canonical issues on a site's domain, and my usual tool of choice was showing response codes that I thought were incorrect - it was almost as if there was a meta refresh happening.
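A meta refresh redirects in the browser without changing the HTTP response code, which is why a crawler can report a clean 200 on a page that actually bounces you elsewhere. A stdlib sketch for spotting one in fetched HTML (class and function names are mine):

```python
from html.parser import HTMLParser

class MetaRefreshFinder(HTMLParser):
    """Spot <meta http-equiv="refresh"> tags in a page's HTML."""

    def __init__(self):
        super().__init__()
        self.refresh = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("http-equiv", "").lower() == "refresh":
            self.refresh = attrs.get("content")  # e.g. "0; url=http://..."

def find_meta_refresh(html):
    """Return the refresh content string, or None if the page has no meta refresh."""
    parser = MetaRefreshFinder()
    parser.feed(html)
    return parser.refresh
```

Run it across the domain's pages and any non-None result flags a page that redirects despite its 200 response code.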
So, having recently got married, my Mrs wanted all the wedding photos turning into greyscale. I originally planned to do this with pixlr.com one at a time, as I thought she only wanted a few doing - it turned out to be all of them, and there are around 600!
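Batch-converting 600 photos is a few lines with the Pillow imaging library (assumed installed via `pip install Pillow`; the folder layout and function name here are my own sketch):

```python
import os
from PIL import Image  # Pillow - assumed installed

def greyscale_folder(src, dst):
    """Convert every image in src to greyscale and save it into dst."""
    os.makedirs(dst, exist_ok=True)
    done = []
    for name in sorted(os.listdir(src)):
        if name.lower().endswith((".jpg", ".jpeg", ".png")):
            img = Image.open(os.path.join(src, name)).convert("L")  # "L" = 8-bit greyscale
            img.save(os.path.join(dst, name))
            done.append(name)
    return done
```

Point it at the folder of originals and a fresh output folder (so the colour versions survive), and all 600 convert in a couple of minutes.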
I'm still on my path to Pythonic enlightenment, and today on my break I wrote a little script that I'm now going to use to keep my desktop tidy and my files well organised. Basically, I'm terrible for saving files to my desktop; not only is this messy and unorganised, it's dangerous - if my computer dies, my work might die with it, so my files should be stored on the company's backed-up common drive.
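A common shape for this kind of tidy-up script is to sweep the desktop into per-extension folders on the destination drive. This is a sketch under my own assumptions (folder-per-extension layout, function name), not the post's actual script:

```python
import os
import shutil

def tidy(folder, dest):
    """Move every file in folder into a dest subfolder named after its extension."""
    moved = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            ext = os.path.splitext(name)[1].lstrip(".").lower() or "no_extension"
            target_dir = os.path.join(dest, ext)
            os.makedirs(target_dir, exist_ok=True)
            shutil.move(path, os.path.join(target_dir, name))
            moved.append((name, ext))
    return moved
```

Run it with the desktop as `folder` and the backed-up common drive as `dest` (manually or on a schedule), and the desktop stays clear while everything lands somewhere that survives a dead machine.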