Introducing YippieMove '09. Easy email transfers. Now open for all destinations.

We’re aware that the images in our RSS and Atom feeds are not showing up in their expected positions. The problem is that the CSS that positions them isn’t delivered along with the feed, and our attempts to inline the CSS haven’t worked with the feed readers we tested. We hope to have a solution to the odd-looking feeds soon.


A small update of Cuzimatter, our social bookmark utility, went online today. The improvements are:

  • Polished user interface.
  • Support for Yahoo! My Web 2.0 bookmarks.

We know there are a few utilities like this one out there, but we still think Cuzimatter is among the easiest to use. There are no plugins to install, nothing to download. Just click and go.


Recently, Wikipedia announced that it would add the nofollow tag to all outbound links from its site. Ostensibly this was to take away the incentive to use Wikipedia for spam, e.g. shady companies posting unrelated links to themselves on Wikipedia pages. But as a side effect, this will greatly increase Wikipedia’s PageRank, at the cost of every other site on the internet. In fact, Wikipedia has stumbled upon an amazing tool for spam and Search Engine Optimization. Read on to find out how it works.

Most websites these days are very concerned with being prominently visible in search engines. For Google, the number one measure of a page’s importance is called ‘PageRank’, and thus this is what everyone wants more of. Presumably both Yahoo! and MSN Live Search use similar ranking techniques.

PageRank (abbreviated PR) is loosely based on the number of incoming links to a webpage. More precisely, it depends on how many pages point to a particular page, and what PR those linking pages have in turn. The best known and most detailed explanation of this seemingly circular definition may be Ian Rogers’ “The Google Pagerank Algorithm and How It Works”.
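The circular definition resolves itself through iteration. As an illustrative sketch (using the simplified formula from Rogers’ article, PR(A) = (1 − d) + d × Σ PR(T)/C(T), with the hypothetical link graph below), repeatedly recomputing each page’s rank from the current ranks of its linkers converges to a fixed point:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a tiny link graph.

    links maps each page to the list of pages it links to.
    Each page's rank is (1 - d) plus d times the sum, over every
    page T that links to it, of PR(T) divided by T's outlink count.
    """
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # any starting value converges
    for _ in range(iterations):
        new = {}
        for p in pages:
            incoming = sum(pr[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - damping) + damping * incoming
        pr = new
    return pr

# A hypothetical three-page web: A and B link to each other,
# C links only to A. C has no incoming links at all.
ranks = pagerank({'A': ['B'], 'B': ['A'], 'C': ['A']})
```

Note how C, with no incoming links, bottoms out at the minimum rank of 1 − d = 0.15, while A ends up highest because it collects rank from both B and C.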

At the end of Mr. Rogers’ article, the following advice is presented:

If you give outbound links to other sites then your site’s average PR will decrease (you’re not keeping your vote “in house” as it were). Again the details of the decrease will depend on the details of the linking.

So if you want a high PageRank, part of your strategy may be to not link out. This is easier said than done, though. A website that could get away with no outbound links whatsoever, and still be interesting, would be a rare website indeed. The very idea behind the WWW, the World Wide Web, is that it is a web of links. Without links, a website would be no more interesting than an ordinary printed sheet of paper. So you have to link, and losing your hard-earned PageRank thus seems to be an inevitable consequence of making a normal website.

The challenge a website owner is faced with is this:

  1. You have a website and you want it to be popular.
  2. You want to have a higher PageRank because then people will find your site.
  3. Using outbound links reduces your PageRank.
  4. You must have outbound links or your site will be rather boring.

So what to do? Simple. Link a lot to yourself and not a lot anywhere else. To illustrate how this is normally done, I will briefly describe two traditional techniques below.

Most blogs allow users who make comments to add a link to their own website or blog. Since this leads to many outgoing links, a comment section can be described as a ‘PR drain.’ The technology blog Engadget works around this by not letting the user link their name to a website when commenting. While I won’t say that Engadget uses this technique intentionally, this is an example of something that helps to keep page rank bottled up inside of a site. Engadget and its sister blogs also employ enormous link lists at the bottom of every page so that even if there are a few outbound links, they are dwarfed by the large number of links pointing back at Engadget itself or to other sites within the network.

Some sites, like Ars Technica, isolate their outbound links by putting the link-intensive comments section for each article on a separate ‘discussion’ page. Notice how at the bottom of this article at Ars, there is only one single ‘discuss’ link followed by internal links to other Ars Technica articles. Most other technology news sites would have a live section of comments in this area, but Ars avoids this PR drain gracefully with their ‘discuss’ link. The effect is that whenever an Ars Technica news entry gets a higher PageRank because of people linking to it, almost all of that rank stays within the site. (Perhaps this is unintentional; Ars has a forum which they promote by putting their discussions there.)

All of these methods have two things in common: they’re obvious at a glance, and at the end of the day they still pass at least some PageRank on to other sites.

The ‘nofollow’ method is much less obvious. Nofollow is a ‘tag’ you can add to a link so that search engines won’t take note of it. It is invisible to the user and does not affect their experience in any way. The nofollow tag was brought to life by Google, who back in 2005 announced that they would disregard nofollow links. The announcement can still be found on their official blog. The reason Google introduced this policy was to give webmasters a tool to discourage spamming. If all user-entered links had the nofollow tag added to them, the links would be less useful to spammers: even if a spammer put hundreds of links to their site in some blog’s comments, the site wouldn’t become any better ranked, because all of the links would get the nofollow tag and the search engines would disregard them.
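Concretely, ‘nofollow’ is a value of a link’s rel attribute, as in `<a href="http://example.org/" rel="nofollow">…</a>`. As an illustrative sketch (not Wikipedia’s actual code, and with deliberately simplified HTML handling), a site could stamp it onto every external link like this:

```python
import re

def add_nofollow(html, own_domain):
    """Add rel="nofollow" to every anchor pointing outside own_domain.

    Simplified sketch: assumes double-quoted href attributes and
    leaves alone any link that already carries a rel attribute.
    """
    def fix(match):
        tag, href = match.group(0), match.group(1)
        if own_domain in href or href.startswith('/') or 'rel=' in tag:
            return tag  # internal or already-tagged link: leave as-is
        return tag[:-1] + ' rel="nofollow">'  # inject before closing '>'
    return re.sub(r'<a\s+href="([^"]+)"[^>]*>', fix, html)

page = ('<a href="/wiki/Spam">internal</a> '
        '<a href="http://example.org/">external</a>')
tagged = add_nofollow(page, 'wikipedia.org')
```

The internal link passes through untouched; only the external one gains the attribute, which is exactly the ‘rank stays home’ behavior described above.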

Because it is invisible, nofollow is the ultimate PageRank-retaining technique. If a site went ahead and put nofollow on every single external link, it would become a site from which no PageRank would ever ‘leave’. Every incoming link would add rank to that site, and the site itself would never add rank to any other site. Previously the only way to achieve this effect was to simply not have any outgoing links, and the site would suffer for it. With nofollow you get the best of both worlds. You can link like there’s no tomorrow, and make your users happy, while at the same time you can tell search engines that you couldn’t care less about the sites you link to.

And this is exactly what Wikipedia has done. As most people using search engines are aware, Wikipedia is often at the top of the search results for almost any relevant query. People like to link to Wikipedia. We have done so ourselves here at Playing With Wire from time to time. And now that Wikipedia has gone into nofollow mode, it will never ever let go of the rank you give it by linking to it.

As other bloggers have pointed out, some quite angrily, this will have widespread repercussions. Wikipedia becomes a black hole of PageRank. Search engines are affected negatively. If a majority of the sites on the internet started to use nofollow, then what would the search engines have to work with when determining the most popular site?

Wikipedia claims they made the change to reduce spam, and I believe them: this might have been their intention. But at the end of the day Wikipedia has greatly increased its own PageRank at the cost of the rest of the internet. And in doing so, Wikipedia has shown the dark side of nofollow. Even as I write this I am sure there are greedy site owners combing their whole sites and adding nofollow tags to every external link, following Wikipedia’s example. Wikipedia has effectively demonstrated the ultimate PageRank retaining technique. Indeed, Wikipedia has perhaps taken the first step towards a future internet where no-one links to anyone in a search engine compatible way, just in order to hoard the precious currency of the internet: PageRank.

What do you think? Is Wikipedia’s new policy an honest spam reduction effort or a masterful Search Engine Optimization move? Will every site on the internet soon be using nofollow? Will Google have to retract their nofollow policy to save their search system from breaking down?


That title sure caught your attention, right? It’s not as bad as it sounds. I’m not a Black Hat hacker, I just enjoy reading security related books.

Back in 2003 the famous/notorious hacker Kevin D. Mitnick released his first book, The Art of Deception, which discussed different elements of security that relate to social engineering. When I read the book back then, I was both shocked and amazed at how easily a well-skilled social engineer can gain access to the most sensitive types of information.

In his latest book, The Art of Intrusion, Mr. Mitnick moves on from social engineering to discuss digital security. Since I personally have much more experience with digital security than with social engineering, the techniques used in the stories were not that exciting. However, the plots of the stories were quite interesting. In a couple of the stories in the book, the reader gets to follow security consultants who work on penetrating various companies (the company names are not mentioned). Even though the techniques used by these consultants were perhaps not shocking in any way, the way they thought was. The guys in these stories really know how IT administrators at midsize and large corporations think, and where they’re likely to cut corners and be lazy. The stories prompt questions such as:

  • Did you disable all network ports that are not in use?
  • Did you change the default password on all your network-equipped devices?
  • Is your internal voice-mail system using the default password?
  • Did you install the latest patches on all your servers? Even the internal ones?
  • Did you disable all services that are not in use?

Even though the book brings little new technical knowledge to a tech-savvy person, it shows how a skilled hacker can obtain important information about your system from what you might think is trivial information.

Verdict: I would recommend this book to anyone who works with technology or security in a corporate environment. And if you haven’t read The Art of Deception, I’d recommend reading that one as well.


You may have seen our farewell letter to Blogger. And yes, it is true: we have switched from a Blogger system to a WordPress system.

The transition was made during the late hours of Saturday night. We hope we have caused as few disturbances as possible – we have even relinked every old article to its new permanent address (by hand, no less!). Still, let us know if there is anything that is broken or doesn’t seem to be working like it should. Your feedback is greatly appreciated.

So why did we make the switch? Well, as the first post hinted, we have had trouble with Blogger’s stability. But this was not the main reason we switched. The main reason was that even though we were hosting our own published version of our Blogger blog, a few links on every page still pointed back to material on Blogger’s servers. And despite Google’s legendary connection speeds, those servers did not seem to get any of that speed. Time and time again we saw the front page stall for several seconds while the web browser waited for a file. And whatever they were doing, they didn’t even seem to have cache control enabled, so these files would be fetched over and over again, possibly with a multi-second delay each time.

We don’t think WordPress is faster than Blogger. In fact, I’m very certain that WordPress is slower by an order of magnitude. Blogger generated static pages for all content. Every time a comment was posted, the relevant post’s static page file was updated. Every time a new post was made, old pages were regenerated with the relevant links to the new page. This of course is optimal. You can hardly make a web server any faster than it is when serving static pages, especially not with the right Apache configuration.

So in theory Blogger was extremely fast, and we will be taking a performance hit by switching to dynamic pages with WordPress. But in practice, Blogger was often very slow due to those few non-cacheable header links. It shouldn’t take seconds to load a single page, especially not if all images are already cached.
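For comparison, enabling browser caching for static assets in Apache takes only a few lines. A hypothetical mod_expires configuration (assuming the module is installed; the exact types and lifetimes are just examples) might look like this:

```apache
# Hypothetical sketch: tell browsers to cache static assets so they
# are not re-fetched on every page view (requires mod_expires).
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
</IfModule>
```

With something like this in place, repeat visitors wouldn’t hit the server for the header images at all.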

So we didn’t have a choice. Blogger was also pushing us to upgrade to the new version of Blogger, and if we did, we wouldn’t be able to use WordPress’s built-in Blogger import feature, which we ultimately used to get all old posts and comments over to the new system.

That said, the switch wasn’t entirely because of Blogger’s drawbacks. WordPress has some very nice features of its own. For example, you can create pages, like our About page. This is a great touch of CMS functionality that saves us the trouble of theming random pages by hand.

Next we will look into generating a new sitemap and page caching. We hope that your Playing With Wire experience is faster already though.

Please let us know if you find anything that doesn’t seem to work as expected.

Update 1: We did try to contact Blogger about the performance issues two months ago, but we never heard back from them.


© 2006-2009 WireLoad, LLC.
Logo photo by William Picard. Theme based on BlueMod © 2005 - 2009, based on blueblog_DE by Oliver Wunder.