According to the Twitter API, you can extract a maximum of 3,200 tweets from a user's timeline, 200 at a time. To do this I'm using Twython, a Python wrapper for the API. In the following tutorial, I'm going to show you how you can get all tweets from a user.
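The paging approach can be sketched like this with Twython. The credentials are placeholders, and `get_all_tweets`/`next_max_id` are my own naming for illustration, not the post's final script; the key idea is passing `max_id` as one below the oldest tweet already fetched so pages don't overlap.

```python
# Placeholder credentials -- substitute your own Twitter app's keys.
APP_KEY, APP_SECRET = "xxx", "xxx"
OAUTH_TOKEN, OAUTH_TOKEN_SECRET = "xxx", "xxx"


def next_max_id(tweets):
    """max_id for the next page: one below the oldest id fetched so far."""
    return min(t["id"] for t in tweets) - 1


def get_all_tweets(screen_name):
    # Imported here so the pure helper above works without twython installed.
    from twython import Twython

    twitter = Twython(APP_KEY, APP_SECRET, OAUTH_TOKEN, OAUTH_TOKEN_SECRET)
    all_tweets, max_id = [], None
    while True:
        kwargs = {"screen_name": screen_name, "count": 200}
        if max_id is not None:
            kwargs["max_id"] = max_id
        page = twitter.get_user_timeline(**kwargs)
        if not page:  # no older tweets left (the API caps out around 3,200)
            break
        all_tweets.extend(page)
        max_id = next_max_id(page)
    return all_tweets
```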
I've always been a fan of automation; to me it makes total sense. At Bronco I automate many of my day-to-day tasks, as well as those used less frequently but which are often more mentally taxing.
Here's my intro to data slicing with Python Pandas (using Jupyter)...
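A flavour of the kind of slicing the intro covers, on a toy DataFrame I've made up for illustration (the column names are my own, not from the post):

```python
import pandas as pd

# Toy dataset standing in for whatever you'd load with pd.read_csv.
df = pd.DataFrame({
    "keyword": ["red shoes", "blue shoes", "green hats"],
    "visits": [120, 80, 45],
    "conversions": [6, 2, 1],
})

# Boolean-mask slice: only rows whose keyword mentions "shoes".
shoes = df[df["keyword"].str.contains("shoes")]

# Label-based slice of rows and columns with .loc.
top = df.loc[df["visits"] > 50, ["keyword", "visits"]]

# Position-based slice with .iloc: the first two rows.
first_two = df.iloc[:2]
```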
So, I'm quite a fan of LinkedIn and I've recently got into the habit of liking people's statuses and updates, as well as trying to expand my network past annoying recruiters (sorry guys) by adding people on a daily basis.
Content is king! Didn't you know?! You've probably heard this so many times that you're close to ripping the face off the next person who says it to you. You may have noticed that not many of those spouting "quality content" can give a clear definition; I'm going to see if I can.
Rand of Moz was kind enough to agree to be interviewed; here we talk about his blackhat days, the costs of running Moz, company culture, customer loyalty and much more. Enjoy.
Over the last few weeks I have been working on a web app that needs the ability to query hundreds, thousands or even millions of pages. Currently the biggest batch I've tested it on in one go was nearly 250,000 pages, which took about an hour and a half to complete.
Python classes have been difficult to learn; the tutorials for them are either too in-depth and complicated, or so simplistic that they don't give you a good feel for how you might use one. I'm going to try to hit the sweet spot, though it should be noted that I am still very much a noob when it comes to Python classes.
As the title suggests, I have been quietly working away building a web app in my spare time. There is more to do yet, but it is arguably already a useful tool, and with a bit of polish I could probably launch it. Without giving too much away, it basically diagnoses indexing issues within Google. That's it.
I wanted to keep track of a ranking position today, so I repurposed some existing code I had and came up with something similar to the below. I have changed it to be more general; really the only difference is that the original only printed to the screen if the ranking domain was the one I was interested in (it was a new page and I was testing how long it would take to rank, if at all), and it just checked the same query every 10 minutes or so.
Using a persona for outreach purposes merely improves your chances of conversion. Don't see a persona as being dishonest; it's just the part of you that likes a particular topic.
I'm tired of reading about content curation at the moment, so I thought I would share a random thought. Just an FYI: I'm not a black hat or anything like that, I just find the idea interesting. :)
I thought I'd throw together some of the blogs on my regular reading list, to show my appreciation as well as to help broaden some people's reading lists. I'll keep adding to the list as and when...
Today I was tasked with renaming over a thousand files. This comes off the back of a recommendation to the client to rename them to be more specific to what they are; essentially, they are changed from their product code to their product name. This is fairly standard practice and it's something worth doing, especially if your product is fairly visual, as ranking an image is a piece of cake.
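A minimal sketch of that kind of bulk rename, assuming you have an old-name/new-name mapping in a two-column CSV (the CSV layout and the `bulk_rename` name are my assumptions, not the post's actual script):

```python
import csv
import os


def bulk_rename(folder, mapping_csv):
    """Rename files in `folder` using rows of old_name,new_name from a CSV."""
    with open(mapping_csv, newline="") as fh:
        for old_name, new_name in csv.reader(fh):
            src = os.path.join(folder, old_name)
            if os.path.exists(src):  # skip codes with no matching file
                os.rename(src, os.path.join(folder, new_name))
```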
So as the title suggests, and as you may or may not have noticed, I have a new design and a whole new backend. After getting more and more fed up with having a WordPress-powered site, I thought it was time to update things and use my own system - which a dev at work has dubbed 'CraigPress'.
When I first saw a list comprehension I simply shook my head in utter confusion, but the fact is they are actually fairly straightforward. Sure, you can make them more and more complicated, but here are some basic examples to get you started...
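The basics can be boiled down to a few one-liners (the sample data here is my own, just for illustration):

```python
words = ["seo", "content", "links", "python"]

# Transform each item: the loop variable feeds the expression on the left.
upper = [w.upper() for w in words]

# Filter with a trailing if-clause: keep only words longer than 4 letters.
long_words = [w for w in words if len(w) > 4]

# The same shape works for dicts: map each word to its length.
lengths = {w: len(w) for w in words}

# Combine with enumerate to keep each item's position.
pairs = [(i, w) for i, w in enumerate(words)]
```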
So I've done a little web scraping, mainly for myself and a little for work purposes. A few nights ago my wife was telling me about work and a task she had to do for one of her colleagues: basically fetching data from a website full of contractors' information from within her industry. Intrigued, I took a look and thought I could probably automate this :)
I wrote the following script in an attempt to monitor my client sites' uptime; essentially, if a site goes down for whatever reason, I will be notified via email. This doesn't include sites hosted by ourselves (Bronco), as they are monitored already; this is for sites where we only do consulting and they are hosted by others.
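The shape of such a monitor, as a stdlib-only sketch: the site list, email addresses and the local SMTP relay are all placeholders I've assumed, not the post's real configuration.

```python
import smtplib
from email.message import EmailMessage
from urllib.error import URLError
from urllib.request import urlopen

SITES = ["http://example.com"]  # placeholder list of sites to watch


def check_site(url, timeout=10):
    """Return None if the site responds with 200, else a short error string."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return None if resp.status == 200 else "HTTP %s" % resp.status
    except URLError as exc:
        return str(exc.reason)


def build_alert(url, error):
    """Build the notification email for a site that failed its check."""
    msg = EmailMessage()
    msg["Subject"] = "Site down: %s" % url
    msg["From"] = "monitor@example.com"  # placeholder addresses
    msg["To"] = "me@example.com"
    msg.set_content("%s appears to be down: %s" % (url, error))
    return msg


if __name__ == "__main__":
    for url in SITES:
        error = check_site(url)
        if error:
            with smtplib.SMTP("localhost") as smtp:  # assumes a local relay
                smtp.send_message(build_alert(url, error))
```

In practice you'd run this on a schedule (cron, say) rather than as a one-off.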
I have had to parse a number of XML sitemaps this week for different reasons, so I thought I would make the job a little easier and quicker. There are standard libraries specifically for parsing XML, but this is what I came up with...
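Using one of those standard libraries, a minimal version looks like this: `ElementTree` plus the sitemap namespace pulls every `<loc>` out of a urlset (the `sitemap_urls` name and the sample XML are mine, for illustration):

```python
import xml.etree.ElementTree as ET

# Sitemaps live in this namespace, so findall must be told about it.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def sitemap_urls(xml_text):
    """Return every <loc> URL from a urlset sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]


sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
</urlset>"""
```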
There are obviously dozens of reasons to want to see Twitter and Facebook shares, so I have written a surprisingly simple script to do the job for me; I thought this would take 50+ lines of code at least. Obviously this only checks a single URL, but with a slight modification I could feed in multiple URLs: competitors', a news or blog section, or an entire site.
I wrote this script as I was getting pissed off with Excel crashing when checking a measly 100,000 links; working with formulas, specifically VLOOKUP, can be a real nightmare!
When working with large datasets I tend to use Python, as it's a lot faster than Excel for file manipulations and doesn't crash on large inputs.
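As an example of swapping Excel for Python, a pandas left merge does the job of a VLOOKUP on a dataset of any size (the toy frames here are my own illustration):

```python
import pandas as pd

links = pd.DataFrame({"url": ["a.com", "b.com", "c.com"]})
metrics = pd.DataFrame({"url": ["a.com", "c.com"], "da": [40, 25]})

# A left merge keeps every row from `links`; URLs with no match come back
# as NaN -- exactly where VLOOKUP would show #N/A, minus the crashing.
joined = links.merge(metrics, on="url", how="left")
```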
I've just been trying to check for canonical issues on a site's domain, and my usual tool of choice was showing response codes that I thought were incorrect; it was almost as if there was a meta refresh happening.
Wow what a year, most notable was of course getting married, after 7 years we finally did it. A small, intimate family gathering at Gretna Green; it was a fantastic day with smiles all round.
So, having recently got married, my Mrs wanted all the wedding photos turning into greyscale. I originally planned to do this with pixlr.com one at a time, as I thought she only wanted a few doing; it turned out to be all of them, and there are around 600!
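Batch-converting 600 photos is a few lines with Pillow; this is a sketch under my own naming and folder layout, assuming the photos are JPEGs:

```python
from pathlib import Path

from PIL import Image


def greyscale_folder(src_dir, dst_dir):
    """Convert every JPEG in src_dir to greyscale, saving into dst_dir."""
    dst = Path(dst_dir)
    dst.mkdir(exist_ok=True)
    for path in Path(src_dir).glob("*.jpg"):
        # "L" is Pillow's single-channel greyscale mode.
        Image.open(path).convert("L").save(dst / path.name)
```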
I'm still on my path to Pythonic enlightenment, and today on my break I wrote a little script that I'm now going to be using to keep my desktop tidy and my files well organised. Basically I'm terrible for saving my files to my desktop; not only is this messy and unorganised, it's dangerous, as if my computer dies, my work might die with it - so my files should be stored on the company's backed-up common drive.
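A tidy-up script along those lines can be sketched with the stdlib; the extension-to-folder mapping here is my own grouping, not the post's:

```python
import shutil
from pathlib import Path

# Which sub-folder each file type should end up in (adjust to taste).
DESTINATIONS = {
    ".py": "scripts",
    ".csv": "data",
    ".xlsx": "data",
    ".png": "images",
    ".jpg": "images",
}


def tidy(desktop, archive):
    """Move recognised files off the desktop into per-type folders."""
    moved = []
    for path in Path(desktop).iterdir():
        folder = DESTINATIONS.get(path.suffix.lower())
        if path.is_file() and folder:
            target = Path(archive) / folder
            target.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(target / path.name))
            moved.append(path.name)
    return moved
```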
Today we have an interview with Paul May of the popular link building tool BuzzStream. So once again, without further ado, here it is... Enjoy! (Thanks again to Paul for taking the time out to do this interview; if anyone has any questions, leave them in the comments below!)
Sometimes, for a client or your own company, you need to come up with content ideas. To fall in line with the seasonality of your niche, you can often come up with content ideas far in advance, but there are always new topics that crop up, and it's a good idea to jump on them as soon as possible.
Lately I have heard lots of talk about conversion rate optimisation. While it's nothing new, I'm seeing more and more companies jumping on board, mainly because companies like Google have made it very easy and very quick to set up A/B and multivariate testing.
Today I have the pleasure of bringing you an interview with the SEO community's "foul-mouthed contrarian" and the man behind #arnieseo, Barry Adams. So without further ado... Enjoy!
Today I'm excited to bring you an interview with James Agate of Skyrocket SEO. You probably already know of James, but if you don't, I highly recommend you sign up for his Guest Blogging Track; it's one of the few email subscriptions that I haven't binned, as it's always full of high-quality goods! Anyway, let's get cracking with the interview.
Just a quick one today. I have been using Twitter lists for a little while now, playing with how best to use them for increased engagement and as a way to help manage an account as optimally as possible.
Meta descriptions are often overlooked - even SEOmoz have theirs automated. Meta descriptions are annoying: you finish a blog post that has taken you a few days to write, and then there are 150 or so characters of hell left to go.
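At least the character-counting part of the hell is automatable. A tiny checker, where the 155-character cutoff is my assumption about the rough SERP truncation point at the time:

```python
LIMIT = 155  # approximate SERP display cutoff (an assumption)


def description_status(desc):
    """Flag meta descriptions likely to be truncated or too thin."""
    n = len(desc)
    if n > LIMIT:
        return "too long (%d chars, trim %d)" % (n, n - LIMIT)
    if n < 70:
        return "too short (%d chars)" % n
    return "ok (%d chars)" % n
```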