About a year ago, one of our cats decided to take a drink of water. She did this from a glass that was on my desk, and in so doing she tipped it over and gave my iPhone a bit of a bath. Thankfully, the water was shallow and my phone wasn’t destroyed, but it was damaged and the home button hasn’t worked right since.
I’m flying out to PAX in Boston in a mere seven and a half hours, and the only electronics I am taking are my Kindle and my iPhone. I’m going to try something new this time around and blog exclusively through my phone, but I won’t be doing it here.
What with the recent uproar about Google’s take on net neutrality (it’s alright for some, but we’ll just make our own Internet over there), it’s fair to question if our behaviour will change any. Ryan posed this question yesterday:
So, here’s a question: Will you stop using Google’s services? *Can* you stop using Google’s services? Has the benefactor become the beast…
Despite having written about it a few days ago, I suspect the whole thing is a lot of stress about nothing. Nothing has happened other than Google and Verizon saying, “Hey, let’s do this thing.” Sure, Verizon has the 700MHz spectrum and Google has… well, everything. But that doesn’t mean they’ll wield their power for evil.
We may be headed towards an apocalyptic breakdown in our rights and privacy, but for now I’m not going to unplug. I recently switched my email to Google, and they’re still doing pretty well by me. I worry about the destruction of the Internet as we know it, but I remain confident in two things. The first is basic capitalism, and the second is the plight of the lazy idealist.
In regards to capitalism, Google and Verizon are going to do the thing that makes money. They can only go so far by pissing people off, so they can’t do anything overtly terrible or their customers will leave. I think they’ll keep this in mind, and I don’t think we’ll see the destruction of the Internet or anything so grandiose as that. Their parallel Internet may not even materialize, but if it does I suspect it will be a good thing.
Which brings me to the lazy idealist, which is an apt term for myself. I like open source software, Linux, and all the philosophy that goes with it, but what computer do I use most? My MacBook, arguably the most closed source system I could get. The reason I like it is because it works so well, and I suspect the same would be true of whatever Google and Verizon cook up. Maybe they will make something unholy and terrible, but if they do it’ll probably also be delicious and filled with magic. As long as the company takes care of its customers positively, I can’t complain too terribly much. Sure, I like freedom, and Apple gives less than Linux, but I also like things that are pretty and work well and are compatible with the work I have to do.
In the end, there’s no point speculating. We’ll see what comes of this and decide when the deciding time comes. Until then, I’ll keep using the best tools that are available to me, and for search and email that remains Google. Could I leave if I needed to? Absolutely–without a hitch or a problem. But I doubt it will come to that.
Like most everyone I know, I wasn’t sure what to make of Twitter when I first began using it. I didn’t know anyone on it, and I wasn’t sure I’d really use it all that much… and now I’ve sent well over 4000 messages through it, it’s my only means of communication with PAXers, and I’m not sure I could survive without it.
What’s more, Twitter has become an extension of my website. Whenever I publish a blog entry, it pops onto Twitter. It’s primarily through Twitter that I keep in touch with people from PAX, and for many, it is only through Twitter that they know me. Like my site, it ought to reflect my personality a bit more than a stock background, and perhaps more to the point, it ought to match up with my site.
I’ve written before about the importance of branding, so even though I don’t run Twitter, I should put my personal stamp on it. The first place I turned for help was Google, and it led me to the article Make a Good Impression with a Custom Twitter Background on Twitip. This article has some excellent advice, so if you’re looking to make a custom Twitter background, this is pretty much all you need.
I would offer one additional tidbit, though. Darren writes about the screen resolutions your visitors are likely to use and how that can affect your Twitter background. While taking this into account, also take a look at your Google Analytics results (which you ought to have set up), where you can see what resolutions visitors to your site are using (see screenshot). I ended up going with a 200px-wide branding spot; 120px was simply too narrow, and I think it’ll work for enough people to be worthwhile.
There is more openness to new words these days than there once was. People seem more apt to jump to the argument that, “Hey, if I understood what he was saying, then it works!” and worry less about proper words or grammar. Whether the cause is a failure of the public school system or the rise of the Internet and text messaging, the general populace (notably including those with partial or completed degrees in higher education) is regularly coining new words that gain such traction they enter the popular lexicon. Dictionaries have added google as a verb and are considering lol. It’s only a matter of time.
As for me, I resist these trends. I continue to say that “I searched for something on Google,” and I avoid using acronyms in everyday conversation whenever possible. When I send a text message, I type it properly, character count be damned. Despite my best intentions, though, Twitter has broken my will.
I love this stupid little service, and it has been invaluable to me. Whenever I post a blog update, my Twitter status is updated. My Twitter updates are pushed to my Facebook status, and I have been able to follow a great many people attending the Penny Arcade Expo so I can communicate and connect with them. It has helped me professionally as I have received assistance on technical issues, and it has certainly increased the feedback I have gotten on life events and writing. In general, it has helped my communication with others online.
With a limit of 140 characters per message, though, space is at a premium. I realized early on that if I had something to say, or wanted to make note of something important (say a news article I had read), it was better to write about it on my blog and have Twitter post a link to my site. This not only earned me more traffic, it cut down on Twitter spam compared to some people who just post message after message. In addition to this realization, I have also been working on concision in my messages and learning where I can cut unneeded words and characters. I still write proper sentences (most of the time), but certain phrases are creeping into my vocabulary.
Hence the title of this post. For months I have kept on with the phrase, “I posted on Twitter,” or, “So-and-so posted on Twitter,” but I just don’t have the space for it anymore. Modern technology and its emphasis on concision and time-savings is wearing on me, and I must bend to its will.
I haven’t decided how this will affect my other writing yet. I lean towards concision as a general rule and don’t think a book need be 500 pages to be considered a novel. My rule of thumb is to shoot for 100 pages, and most blog posts are around 500 words where once they were over 3000. I focus on keeping things shorter and to the point, but will I take it even a step further and cut things down more? Only time will tell.
Late last week a fellow Penny Arcade Expo (PAX) attendee asked on Twitter about Google Calendar. She was having trouble figuring out how to organize a lot of different information in that interface, and since I use Google Calendar (GCal) extensively for both work and personal organization, I asked what she was working on and how I could help. As it turns out, she wanted to make a calendar for the Expo in GCal to make things easier for people, and I jumped on the idea. I thought it brilliant and asked her if I could contribute.
Subsequently I have filled in all of the PAX events I know of on two separate calendars, one for official events and the other for unofficial. At its simplest, I like Google Calendar for this sort of thing because a standard calendar is how my brain likes to see dates/times. I have trouble looking at an agenda, itinerary, or Gantt chart to figure out what is happening and when. Seeing it on a calendar is much easier for me.
You can do a lot more with these calendars than just look at them, though. Allow me to demonstrate how these can really help you in preparation for PAX.
Subscribe to Calendar
If you have a Google Calendar account (and if you don’t, might I suggest getting one?), you can simply add these calendars to your own account by subscribing to them. First, under Other calendars, click on the Add button and then Add by URL.
For the URL, copy and paste the following for the [PRIME] PAX 2009 Official Events Calendar:
And for the [PRIME] PAX 2009 Unofficial Events Calendar, you can use:
You can also use those URLs to just see a web page with the calendar, if you prefer to avoid Google.
Update: Not sure how I missed this originally, but an even simpler method of adding the calendar is to use the button on the calendar web pages. Just click on one of the links above and then the +Google Calendar button at the bottom right of the calendar.
Add an event to your own Calendar
Now that I have the primary calendars in place, my plan is to go through and add the events I actually plan to attend to my personal calendar. For instance, I’ll go through and click on the panels I want to see, then copy them to my own calendar, a la:
Just click on the time bar (rather than the actual name of the event, which is a link) to pop up this box.
Want the full details of an event? I’ve got everything in for official events and will add more to the official events as I read more. Just click on the name of the event to open up the full details.
Got an iPhone?
Google has very nice iPhone views for most of their applications, and the Calendar could be quite helpful at PAX. A warning about relying on this, though: the wireless, 3G, and EDGE networks are sure to be pushed to their maximums and beyond during PAX as 75,000 of us descend upon Seattle, so it would be best to sync all of this with your iCal or Outlook calendar so you have it locally on your phone. Still, the web interface can be quite nice (if it’s working).
Check your schedule…
or see the details of a specific event…
Note about Time Zones
The time zones of the calendars are set to (GMT -08:00) Pacific Time. If you do not modify your Google Calendar time zone to match that, you will be seeing events in your local time zone. Therefore, for me in (GMT -06:00) Central Time, an event happening at 9 a.m. Pacific shows up as beginning at 11 a.m. I plan to just change my GCal time zone before flying out, much as I might update a watch upon arrival to match the local time. You ought to do the same.
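If you’d rather check the conversion than trust your mental math, here’s a minimal Python sketch using the standard library’s zoneinfo module (nothing Google-specific; the date and event time are made up for illustration):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

pacific = ZoneInfo("America/Los_Angeles")
central = ZoneInfo("America/Chicago")

# A hypothetical PAX panel starting at 9 a.m. Pacific time.
event = datetime(2009, 9, 4, 9, 0, tzinfo=pacific)

# Viewed from a Central Time calendar, the same instant reads as 11 a.m.
local_view = event.astimezone(central)
print(local_view.strftime("%I:%M %p"))  # 11:00 AM
```

The same astimezone call is effectively what GCal does for you when your account time zone differs from the calendar’s.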
Just leave a comment and I’ll do what I can to help 🙂
I’ve spent a lot of quality time with the Google Webmaster Tools (GWT) this week, and it has been an altogether frustrating and enlightening experience. The bottom line is that it is showing my site as having a lot of errors of the 404 – Not Found variety, and this caused a bit of concern because 250+ of those have got to be hurting my search engine ranking.
It is additionally frustrating because I’ve gone to great lengths to prevent this very sort of thing from happening. I use Robots Meta to prevent certain pages from being indexed by search engines, All In One SEO to create metadata, and Redirection to make sure modification or deletion of posts doesn’t cause any disruption. And yet, there they are, staring me in the face: a bunch of pages that can’t be found and are returning errors. First, I’m going to talk about where these errors came from–because not all errors are equal–and whether they actually need to be fixed. Second, I’ll let you in on the secret to 404s and SEO.
What causes the errors?
GWT itself admits that not all errors are really a problem:
Note: Not all errors may be actual problems. For example, you may have chosen to deliberately block crawlers from some pages. If that’s the case, there’s no need to fix the error.
Deleted Posts and Pages
If you have deleted a post or page, updated your sitemap, and consider the case closed, you probably don’t need to worry about it. Eventually Google will stop trying to reach the link and the error will disappear all on its own. The problem is if you have other pages on your site that link to those you have deleted. GWT will tell you what those pages are, and you should edit them to remove the offending links.
This is probably the most benign of the errors because you can see it coming. Others are more mysterious.
Related Posts Plugin
Similar to the last, Related Posts plugins (I use YARPP and rather like it) don’t generally set all of their links nofollow, so they generate a ton of internal links on your site. These links aren’t generally set to nofollow because 1) they’re internal and 2) if you delete a post, Related Posts will update automatically and won’t link to the deleted post anymore. Unfortunately, Google has indexed that Page A links to Page B, so when Page B gets deleted, Google decides there’s an error. This, too, will pass in time as Google catches up, but it’s something of which you should be aware.
Back-end or Codeish Errors
I have no idea what causes these or where they come from, but GWT claims that a lot of my pages are linking to things that simply don’t exist. Namely, some pages are supposedly linking to */function.include, but near as I can tell, there are no links on the originating page that point at */function.include. This would point to there being a problem with the theme I’m using–maybe it has some code pointing to the wrong place and that’s throwing errors–but if that were the case, the errors should be happening from every single page, not just a few.
I went through and manually removed these links from Google’s index, but I’m skeptical of that solution. I’d rather know what is causing it and get it fixed, but this issue is so perplexing that I don’t know how. The good news is that actual users of the site aren’t attempting to follow these links because they don’t really exist on the page, so while the crawler may have trouble, the readers won’t.
Old Sub-sites and robots.txt
This one is more because I’m spastic than anything else. For those of you who have followed this site for a while, you might recall that it has undergone significant changes in the last four years. I’ve gone from WordPress to Mambo!+WordPress to Joomla!+WordPress and then back to WordPress exclusively. I have created a dozen different sub-sites, spin-off blogs, forums, wikis, etc., and consequently deleted those blogs and come back to just having the one centralized site.
As such, I should have gone back and edited my robots.txt to exclude… well, pretty much everything. I’ve done that now, in addition to removing those links from Google’s index, so hopefully that will take care of it.
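If you want to sanity-check your own robots.txt edits before waiting on Google, the Python standard library can parse the file and tell you what a well-behaved crawler would do. This is a sketch with made-up rules, not the actual contents of my file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt excluding old sub-sites; the paths are
# illustrative, not my real rules.
rules = """\
User-agent: *
Disallow: /forums/
Disallow: /wiki/
Disallow: /poetry/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/forums/index.php"))  # False: crawlers stay out
print(rp.can_fetch("*", "/2009/some-post/"))   # True: posts still indexed
```

In production you’d point RobotFileParser at the live file with set_url() and read() instead of a hard-coded string.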
Combining WordPress blogs
When I closed the blogs I mentioned above, I usually imported their posts into my primary site. This can cause a lot of headaches if you’re not careful, so be prepared to sort out the kinks. GWT’s ability to tell you where the errors are happening is great for going back and editing posts to remove or update links, but it’s definitely a manual process. There is simply no way around fixing this stuff: you’re going to have to set aside a block of time, sit down, and get it right.
Pagination
This one originally perplexed me, as I had pages and pages of errors due to pagination. This is where you’re browsing through the site on */page/108 and can go to either */page/107 or */page/109. While typing this, it finally hit me what caused it: going from a single blog post on each page to 5 or 10. I suddenly have fewer pages, but Google hasn’t caught up yet and is still trying to hit those old links. It’ll learn eventually.
So, do 404s hurt SEO?
That depends, as I alluded to above, on whether they are internal or external links that are Not Found. Search engines won’t penalize you if other sites link incorrectly to your content and those links can’t be followed. If they did penalize you for that, then spammers or trolls could create sites with massive amounts of broken links to any site they wanted and drop its pagerank immediately. This obviously wouldn’t be fair, and thankfully search engines don’t work that way. Regardless, it is best to have a custom 404 page to deal with external links that 404. The key is making sure that actual people (rather than bots or crawlers) find your site helpful and get to the information they need/want.
Internal 404s will most certainly cause harm, and that’s where GWT can be of great benefit. By displaying not just the pages that can’t be found but also the pages that link to the 404ed, it helps you find the pages and fix them. As far as search engines are concerned, if your site can’t maintain internal link integrity, it isn’t trustworthy or helpful, so why would they send people your way? If Google started sending people to a bunch of broken sites that didn’t work well, people would stop trusting Google to provide good search results and they’d use a different search provider. That’s why the search engine checks to make sure sites are holding up and working well, and if the site isn’t, its pagerank will drop.
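Finding a page’s internal links doesn’t require anything fancy; a small sketch with Python’s standard library can pull out the same-host links so you could then check each one’s status yourself. The domain and markup here are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(html, base="http://example.com/"):
    # Resolve relative links against the site root, then keep only those
    # on the same host -- these are the ones whose 404s hurt your site.
    parser = LinkCollector()
    parser.feed(html)
    host = urlparse(base).netloc
    return [urljoin(base, h) for h in parser.links
            if urlparse(urljoin(base, h)).netloc == host]

page = '<a href="/old-post/">old</a> <a href="http://other.com/x">ext</a>'
print(internal_links(page))  # ['http://example.com/old-post/']
```

From there, a HEAD request to each internal link that comes back as a 404 is a candidate for editing or a 301 redirect.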
Maintaining internal link integrity is essential, not just for SEO, but also for keeping your readers happy. If someone clicks on a link on your site that goes to your site, they expect that link to work. When it doesn’t, no custom 404 page is going to make them happy. They might accept one error, but beyond that they’re more likely to just surf away.
While it would be ideal to never generate errors, chances are you’ll have at least a few if you’ve been around for a while and actually do something with your website. After 4+ years of active development and changes and well over 300 blog posts in just the last year and a half, these things happen, so I’m going to try to not let them get me down. Use the Google Webmaster Tools to your benefit and get your errors sorted. The work will be worth it in the end, and both the crawlers and your users will be happier when they are able to breeze through without hitting brick walls.
And once you get them taken care of, make sure to check back with GWT regularly to make sure the problem never gets out of hand. Once I get this all fixed, I’ll be logging into GWT at least once a week to make sure nothing new has cropped up. I am confident that my pagerank will benefit from the diligence, and it’ll make my readers happier to have a site that functions entirely as it should. For that happiness, it is well worth the extra work.
Most people who follow technology news are familiar with the media’s conception of Microsoft as the Big Bad Corporation while portraying Google as the heroic underdog that has come to save the day. Google’s motto of “Don’t be evil” has garnered a lot of good press for them, but over the years (and billions of dollars later), they are seeming more and more like the other big players in tech. Despite their increasingly underhanded actions, though, they continue to get good press.
I was a bit shocked at the article I read on ZDNet this morning about a study concluding that silent patching is more secure, with Google praised for doing it. Recognizing that I often rip on ZDNet for their writing, I want to state up front that this article is almost everything that is right about journalism. It stayed fairly neutral, presenting the facts and letting readers reach their own conclusions. However, even the minimal back-patting I perceived towards Google set me on edge. What’s more, the study the author uses to defend silent patching (which Google apparently did recently with its web browser, Google Chrome) was co-authored by Google.
It’s amazing how positive your company’s security practices can look when you write your own security papers that state that what you are doing is The Right Thing™.
A couple of years ago, Microsoft performed a similar stealth update that caused many to pull ye olde pitchfork out of the hay pile and begin gleefully brandishing it at the sky, railing at their over-confident god. Torches were lit and the ground shook under the tramp of the marching masses, though in reality nothing much came of it. People complained and posted online and that was that. Linux perhaps won a few more converts (I had already left Microsoft completely by that time, but it certainly boosted my self-satisfaction).
There are certain things I agree to when using free software, and I recognize that I grant the company providing that software some liberties in exchange for services. I expect and accept that Google mines my email so they can better sell me advertisements–if I really cared about my email being private, I’d run my own encrypted server and that would be that. But I like Gmail and I’m willing to part with certain liberties for its use. When I install a piece of software on my personal computer, though (rather than just running it through a web browser), I come away with a set of expectations. Even though I downloaded the software, it still asked me if I wanted it to be installed. When my word processor or music applications release updates, they ask if I want to upgrade. Similarly, when my web browser needs an update, I expect it to prompt me.
What’s more, software updates can and do break things. This is why we turn Automatic Updates off for all of our servers. Yeah, updates are important and need to be installed, but you don’t just put that stuff on willy-nilly. If an update breaks an existing service, that needs to be sorted before you roll it out. IE 8 is now a critical update for Windows XP, but if it didn’t work with our software, there’s no way we would install it. Just the same, we need to test all updates before installing them, and that ability is taken away when the company pushes a stealth update. Silent patches risk the integrity of our work and the services we provide, and should simply never happen.
In the end, this is my computer, not yours. You don’t just install stuff on it without asking me. That’s not what we agreed to, and if you think it is, then I’ll just have to walk away. You can be sure I won’t be installing Google Chrome on another computer in the future.
Though I am an avid collector of site statistics, spending hours playing with Google Analytics on both my personal website and the ones I run for the university, I don’t really do much with the stats. They don’t drastically shape the way I run my sites, and though I find them intriguing, I don’t do much beyond becoming intrigued. And because I have Google Analytics running, it doesn’t often occur to me to log into Google’s Webmaster Tools to see how things are going on that end. I’ve got a sitemap in place and everything’s solid, but until recently I never thought much about the health of my site.
When I do look, however, I notice that things aren’t as wonderful as I’d like. GWT tells me that there are 198 errors, which sounds pretty serious, and I’ve read elsewhere that an abundance of errors like this can really hurt a site’s pagerank. What’s worse, that number (198) is pretty new. The number of errors seems to be growing. I suppose I shouldn’t be surprised, what with the recent overhaul of SilverPen and all, but a lot of pages aren’t being found. Even more frustrating, a lot of the pages that can’t be found are ones that never existed to begin with. I also find it odd and rather perturbing that Google uses the sitemap as more of a handy reference than as the set of instructions I had intended it to be, so it is indexing some subdirectories and sites I don’t use or particularly want indexed. I’m not entirely sure what Google’s smoking.
I decided to clean things up a bit to see if I could reduce the number of errors I have listed. I didn’t hit all 198 issues, but the ones causing the most errors and some of the more obvious ones have been fixed, often with either a 301 redirect or by editing the page to fix whatever was wrong with it. I’ll know in a few weeks whether this helped or not–unfortunately, you can’t just reset Google’s findings and tell it to crawl your site again.
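The 301 fixes amount to maintaining a map from old URLs to new ones, which is exactly what the Redirection plugin keeps for you in WordPress. In miniature, and with made-up paths, the logic looks something like this:

```python
# Hypothetical old-path -> new-path map, standing in for the table the
# Redirection plugin manages.
redirects = {
    "/tech/some-old-post/": "/2009/05/some-old-post/",
    "/writer/another-post/": "/2009/06/another-post/",
}

def respond(path):
    """Return the (status, location) pair a server should answer with."""
    if path in redirects:
        # 301 Moved Permanently: browsers follow it and crawlers update
        # their index to the new address.
        return 301, redirects[path]
    return 200, path  # serve the page as usual

print(respond("/tech/some-old-post/"))  # (301, '/2009/05/some-old-post/')
```

The important part is the 301 status: unlike a 302, it tells Google the move is permanent, so the old URL eventually drops out of the error report.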
Remember back when SilverPen was actually five blogs (Reader, Writer, Tech, Theology, Main)? That was only 1.5–2 years ago (though it definitely feels longer), but let me tell you, it was a bad idea. The whole thing was a total failure and shouldn’t be done again. As I have been wont to say lately, “If I knew then what I know now…” *shakes fist and waves cane in the air*. I have been slowly working to reintegrate the blogs back into one, because in addition to those five there was also a later poetry blog, one for stories about the elven character Arias Stormsworn, and one for entries directly related to being newlyweds (written by both April and me). At long last, I have all of those shut down, cleaned out, and brought under one roof.
They were really bugging me, not just because of the page errors, but because they represented security risks. Because I wasn’t using or signing into them, the software and plugins got further and further out of date, and that is always a hazard. What’s more, I felt I had this creep of databases and RSS feeds that was getting out of hand. When I look at my DBs and I’m not sure which is which, that raises a flag for me. Now everything is pared down where I know what is what and it’s all solid. In addition to all this, I also signed into April’s site and added some plugins to manage her SEO and tighten everything up, which should help with SilverPen Publishing’s overall health.
I have discovered the wonder of how WordPress manages RSS feeds, so I have created a couple of specific feeds for the items that need them. If I’d known this was possible years ago, everything might have been different. Still, I won’t let that get me down. It’s nice to have everything cleaned up and ship-shape.
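To illustrate what those per-item feeds carry, here’s a sketch that parses a tiny hand-written RSS document with Python’s standard library. The feed contents are invented, but the shape matches what WordPress serves from addresses like /category/poetry/feed/:

```python
import xml.etree.ElementTree as ET

# A miniature RSS 2.0 feed with made-up items.
rss = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>SilverPen Publishing - Poetry</title>
  <item><title>First poem</title><link>http://example.com/first-poem/</link></item>
  <item><title>Second poem</title><link>http://example.com/second-poem/</link></item>
</channel></rss>"""

channel = ET.fromstring(rss).find("channel")
titles = [item.findtext("title") for item in channel.findall("item")]
print(titles)  # ['First poem', 'Second poem']
```

A feed reader does essentially this (plus fetching over HTTP), which is why a stable feed address matters so much.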
FeedBurner was purchased by Google some time ago, which was comforting news to those of us using FeedBurner for our RSS feeds. When you rely on a third party to handle part of your site or business, there’s always the risk that they’ll go out of business and leave you in the lurch, but that’s not really a concern with Google. They’re pretty solid, so there are no worries.
However, the other shoe has finally dropped and, in the transition from FeedBurner to Google’s FeedBurner, feed addresses are now changing. For the moment, the old address will forward to the new one, but that will presumably go away eventually, breaking the old address and forcing people to update.
I wanted to let you all know in advance so you can get your RSS reader(s) updated. The new address is http://feeds2.feedburner.com/SilverpenPub
For more information, see: FeedBurner is Dead, Long Live FeedBurner