Is using asp.net bad for SEO?
There is a common feeling amongst SEO experts that ASP.NET is bad for your website. But is that true? Not necessarily, as we will find out.
This article is not aimed at any specific version of .NET. Whether you use version 1.1, 2 or 3.5, we are all subject to the same potential “downsides” of .NET, but with care we can work around them without a problem.
There are two potential problems with .NET from an SEO perspective: the first is that prior to version 2 the framework was not XHTML compliant; the second is the use of JavaScript post backs from server controls.
ASP.NET 1.1 and XHTML
The first issue is not an SEO problem in itself but more of an approach problem. Many of the ASP.NET server controls in .NET 1.1 render themselves as fairly bad HTML, using proprietary attributes and invalid, unsupported or deprecated code that hasn’t a chance of validating to HTML 4, let alone XHTML. This doesn’t exactly encourage the use of good, standards-compliant markup. The framework also does not support XHTML in any way, so there are going to be problems creating a fully standards-compliant website. The way around this is either to change attitude and try to produce the best markup possible, or to upgrade your web application to .NET 2, which is XHTML compliant. There is plenty of information around about how to do this and it can be extremely easy, but this Microsoft document is a good place to start.
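Once on .NET 2, you can also choose how strictly the server controls render their markup via the `xhtmlConformance` element in web.config. A minimal sketch (the element and its modes are part of the .NET 2 configuration schema):

```xml
<!-- web.config: ask the ASP.NET 2.0 server controls to emit XHTML 1.0 Strict -->
<configuration>
  <system.web>
    <!-- mode can also be "Transitional" (the default) or "Legacy" -->
    <xhtmlConformance mode="Strict" />
  </system.web>
</configuration>
```

Note that "Legacy" mode reverts to the old 1.1-style rendering, which is exactly what we are trying to get away from here.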
JavaScript Post Backs
The second issue is to do with JavaScript post backs. These are mostly problems with ASP.NET datagrids and the HTML code they can generate. When you use a datagrid you are using a very powerful server control that turns an essentially difficult and time-consuming task into a straightforward and quick one. The downside comes when you start using LinkButtons in the grid to perform actions or create hyperlinks: the link that is generated is a JavaScript link that cannot be followed by the major search engines. This can result in a dead end as far as a search engine is concerned. There are two solutions to this:
- Use a sitemap. A full sitemap will ensure that the pages that are linked to are included. You may have to create a manual sitemap for this purpose, but it at least gives you a chance of being indexed. The downside is that whilst a search engine might index the pages, they will, as far as it is concerned, be orphan pages, because it has no idea how they fit into the site structure.
- Use a repeater. This is the most manual option, but the most flexible from a design point of view. You can bind in the same way, but the HTML that is rendered is completely controlled by you. So you can use proper links with the field embedded in them, just as you would have done if you were using classic ASP:
<a href="widgets.aspx?widgetnumber=<%# DataBinder.Eval(
    Container.DataItem, "widgetnumber") %>">Widget 1</a>
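For context, a link like the one above would sit inside the repeater's ItemTemplate. A minimal sketch, where the control ID and the bound column names (`widgetnumber`, `widgetname`) are illustrative:

```aspx
<%-- Page markup: a repeater rendering plain, crawlable anchors --%>
<asp:Repeater ID="WidgetRepeater" runat="server">
  <ItemTemplate>
    <a href="widgets.aspx?widgetnumber=<%# DataBinder.Eval(Container.DataItem, "widgetnumber") %>">
      <%# DataBinder.Eval(Container.DataItem, "widgetname") %>
    </a><br />
  </ItemTemplate>
</asp:Repeater>
```

In the code-behind you bind it just as you would a datagrid: set `WidgetRepeater.DataSource` to your data and call `WidgetRepeater.DataBind()`. The result is an ordinary list of `<a href>` links that search engines can follow, with none of the `__doPostBack` JavaScript a LinkButton would have emitted.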
Conclusion
ASP.NET does not have any inherent limitations for SEO when compared to PHP. It’s just that it is far easier to make a site SEO friendly in PHP than it is with .NET, especially when you consider URL rewriting. .NET is getting better at this, and let’s also not forget that .NET is the more powerful and scalable platform, so the workarounds and extra effort are worth it if your web application requires them.
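URL rewriting is perfectly possible in .NET even without a dedicated module, using `HttpContext.RewritePath` in Global.asax. A sketch under assumed names (the `/widgets/42` pattern and the `widgets.aspx` page are illustrative, matching the earlier example):

```csharp
// Global.asax.cs: map search-friendly URLs like /widgets/42
// onto the real page before the request is processed.
using System;
using System.Text.RegularExpressions;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        Match m = Regex.Match(Request.Path, @"^/widgets/(\d+)$", RegexOptions.IgnoreCase);
        if (m.Success)
        {
            // Internal rewrite: the visitor (and the search engine)
            // still sees the clean /widgets/42 URL.
            Context.RewritePath("~/widgets.aspx?widgetnumber=" + m.Groups[1].Value);
        }
    }
}
```

This gives you the clean, keyword-friendly URLs that come so easily in the PHP world, at the cost of a little extra plumbing.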
Donnie
We’ve found that changing some of the datagrids to repeaters was very straightforward, and really should have been done in the first place. We are getting much more of our deeper pages linked now than we did before, so are seeing a much longer tail on our ‘long tail’.