Tuesday, May 23, 2017

Hidden Text (Revisited)

Just over a year ago I posted on the dangers of hidden text and concluded with the advice “…don’t use hidden text to try to improve your rankings”.

Here is a practical example of what may happen if you do.

Yesterday John Frost, who runs the very popular Disney Blog, posted that his blog had been delisted from the Google index, and sure enough it had:

[Screen shot of a Google web search showing The Disney Blog missing from the index]

Such is the power of popular blogs that within a couple of hours of John’s plea for help there was an explanation and a resolution from none other than Google engineer and spam fighter in chief, Matt Cutts. He explained in a diplomatic and friendly comment that hidden text was responsible for the ban. Specifically, this page code:

<h2 id="banner-description">Informing Disney Fans the World Over with the latest news and updates from all Disney companies, divisions, and related stories. Disney World, Disneyland, Disney Cruises, Disney Animation, Pixar, ESPN, and more are covered in as much detail as I can muster.</h2>

With this in the external CSS file:

#banner-description
{
overflow: hidden; /* clip anything that escapes the zero-size box */
width: 0; /* collapse the element to zero width... */
height: 0; /* ...and zero height */
margin: 0;
padding: 0;
text-indent: -1000em; /* push the text itself far off-screen to the left */
}

As it happens, this appears to be a generic Typepad problem: when you set up a Typepad blog you are asked to enter a Weblog description, which ends up being hidden by the CSS. However, after Matt had pointed it out and John had removed the text, Matt helpfully submitted a reinclusion request.

Matt has gone off to talk to Six Apart, the Typepad developers, and The Disney Blog will be back in the index sometime next week.

The moral of the story is still the same – don’t use hidden text to try to improve your rankings.

The post Hidden Text (Revisited) appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2rcIRJw
via IFTTT

Wednesday, May 3, 2017

What are keywords?

Keywords or keyphrases are the search terms that a user types into a search engine text box in order to find the information they want. For example, if you search Google for |chess| you will see something like this:
[Screen shot of the Google results page for |chess|]

This is called the Search Engine Results Page or SERP, and Google tries to put the most relevant result first, then the next, and so on. What you see is the top ten results out of (in this case) over 24 million pages ranked in order of relevance.

Google can’t know if the user is looking for something more specific, such as chess sets, chess clubs or the rules of chess, so the results will be a broad range of pages related to chess in some way.

A user seeking a chess club in Chicago is more likely to search for |chess club chicago|, in which case they would see something like this:

[Screen shot of the Google results page for |chess club chicago|]

Notice how, for this more specific search, the number of candidate pages has gone down from over 24 million to around 1.8 million. As Google says, “Choosing the right search terms is the key to finding the information you need”.

The obvious corollary for site owners is that choosing the right keywords to optimize for is the key to maximizing the number of visitors and conversions (the percentage of visitors who take a desired action, like buying a product or subscribing to a newsletter). In general, the higher you are in the SERPs the more visitors you will get, and the more specific the keywords the higher the conversion rate.
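To put some purely hypothetical numbers on that: if a broad term like |chess| brings 1,000 visitors of whom 10 buy, that is a 1% conversion rate, while a specific term like |staunton chess set| might bring only 100 visitors of whom 5 buy, a 5% conversion rate and arguably the more valuable traffic.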

Users search in different ways with different words and site owners need to know what these keywords and phrases are for their particular business.

The post What are keywords? appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2p8t21J
via IFTTT

Saturday, April 22, 2017

Keyword Research

Keyword research enables site owners to choose keywords when constructing and optimizing a website (or page). It is also used extensively to manage PPC advertising campaigns and to research and identify profitable niche markets.

When performing keyword research for constructing and optimizing a website (or page) you are looking to select search terms that will reach your target audience. One of the common mistakes made by some SEOs is to avoid keywords that are very competitive. Searchers tend to use a lot of modifiers when they search, and the more competitive the search term the more likely they are to add a modifier; if a competitive term never appears on your pages, you also remove the possibility of ranking well for its many modified variants. Competition can be a factor in deciding how to target a specific search term, but you should never ignore a search term simply because you believe it is too competitive.

There are two basic tools for the site owner:

1. Digital Point Solutions Keyword Suggestion Tool

[Partial screen shot of the tool showing results for “laptop battery”]

Digital Point’s online keyword tool compares Overture and WordTracker data side by side. It is free, quick and easy to use, although it lacks WordTracker’s more advanced features. A partial screen shot is shown above.

2. Google AdWords Keyword Tool

You will need an AdWords account to use the Google AdWords Keyword Tool, but signing up is easy and well worth it just to use the tool. Although primarily designed for AdWords, it is also ideal for use as a simple keyword suggestion tool. The big advantage is of course that it uses the latest Google data, and you can find and select keywords based on this data. You can generate keywords from a URL (i.e. one page), a whole site, one or more keywords that you enter or, for AdWords users, the most relevant terms in your account. The results are shown by relevance but can be ordered by Advertiser Competition or Search Volume on a scale of 1 to 5. You can also download the results as a .csv (for Excel) file, which makes it easy to compile master lists (see the sketch below the screen shots).

Here is a partial screen shot of a list from a keyword.

[Partial screen shot of the keyword list]

Here is a partial screen shot of a list from a URL.

[Partial screen shot of the URL list]
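As a rough illustration of how those .csv downloads can be compiled into a master list (a minimal sketch only: the file names and column names here are hypothetical, so adjust them to match whatever the tool actually exports):

import csv
import glob

# Merge several keyword exports (e.g. one per page or site) into a single
# master list, de-duplicating and keeping the highest reported volume.
master = {}
for path in glob.glob("keywords-*.csv"):  # hypothetical file names
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            kw = row["Keyword"].strip().lower()       # hypothetical column name
            vol = int(row.get("Search Volume") or 0)  # hypothetical column name
            master[kw] = max(master.get(kw, 0), vol)

# Write the combined list, busiest keywords first.
with open("master-keywords.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Keyword", "Search Volume"])
    for kw, vol in sorted(master.items(), key=lambda item: -item[1]):
        writer.writerow([kw, vol])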

Most site owners will find the above tools sufficient for their needs, but if you want to investigate other tools there are basically two kinds: keyword analytical tools and subscription-based tools. Here are some examples, each with a link to the product and a link to a review of the product.

Keyword Analytical Tools:

  • Keyword Analyzer Review
  • The Keyword Bible Review
  • The Dowser Review

Subscription-based:

  • WordTracker Review
  • Keyword Discovery Review
  • Keyword Intelligence Review

A word of caution, though, if you try these tools. The major search engines (Google, Yahoo and MSN) do not make their raw data available to anyone, so these products have to obtain data from somewhere else. For example, WordTracker uses data from the Metacrawler and Dogpile metasearch engines, which represents a very small and not very representative sample of searches. Not only that, but in order to estimate figures like the predicted number of searches for a keyword an extrapolation has to be made. In WordTracker’s case they assume Metacrawler and Dogpile account for 0.86% of all search engine queries (a dubious statistic in itself) and scale up the numbers in their database accordingly. This has the effect of compounding any errors in the original dataset and at the very least means that these derived numbers should not be taken too seriously.
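To see how the scaling compounds errors (illustrative numbers only): a keyword seen 86 times in the sample is reported as 86 / 0.0086 = 10,000 predicted searches, so every stray or missing hit in the tiny sample moves the headline estimate by more than a hundred.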

One of the most important sources of keywords, and one that is often overlooked, is your server logs. Regularly mine your server log data to find the search terms people are actually using to find your site, and use these terms to construct new pages or modify existing ones. You can read more about this process in the posts Long Tail Search and Long Tail Search Tool.
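To give a flavour of what that mining can look like (a minimal sketch, assuming an Apache “combined” format log with a hypothetical name of access.log, and relying on the fact that at the time search engines exposed the query in the referrer, Google in a q= parameter and Yahoo in p=):

import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Count the search phrases found in the referrer field of an Apache
# "combined" format access log.
referrer_re = re.compile(r'"[^"]*" \d{3} \S+ "([^"]*)"')

counts = Counter()
with open("access.log") as log:  # hypothetical file name
    for line in log:
        match = referrer_re.search(line)
        if not match:
            continue
        ref = urlparse(match.group(1))
        if "google." in ref.netloc or "search.yahoo." in ref.netloc:
            query = parse_qs(ref.query)
            phrase = query.get("q") or query.get("p")  # Google used q=, Yahoo p=
            if phrase:
                counts[phrase[0].strip().lower()] += 1

# Print the twenty most common phrases with their counts.
for phrase, n in counts.most_common(20):
    print(n, phrase)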

Update, March 1st, 2016: Wordtracker have introduced a free keyword suggestion tool that will generate up to 100 related keywords.

Update, May 11, 2016: Wordze is a new subscription-based tool with some interesting features.

The post Keyword Research appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2pOUSku
via IFTTT

Friday, April 14, 2017

G profile video

https://www.youtube.com/channel/UCKKraAibgkEjOqvxkC2Vw0A

Short case study showing some rankings & traffic.

The post G profile video appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2pfUbTT
via IFTTT

Sunday, April 2, 2017

Search Engine Friendly URLs

It is important to have search engine friendly URLs if you want your pages spidered and indexed by the search engines, but what does having search engine friendly URLs actually mean? Let’s take a look at what the three major search engines say about URLs:

Google has three things to say on the subject in its Webmaster Guidelines:

1. If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

2. Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

3. Don’t use “&id=” as a parameter in your URLs, as we don’t include these pages in our index.

Yahoo in their Search Indexing FAQ say:

Do you index dynamically generated pages (e.g., asp, .shtml, PHP, “?”, etc.)?

Yahoo! does index dynamic pages, but for page discovery, our crawler mostly follows static links. We recommend you avoid using dynamically generated links except in directories that are not intended to be crawled/indexed (e.g., those should have a /robots.txt exclusion).

MSN’s Guidelines for successful indexing say:

Keep your URLs simple and static. Complicated or frequently changed URLs are difficult to use as link destinations. For example, the URL http://ift.tt/1iUlcEq is easier for MSNBot to crawl and for people to type than a long URL with multiple extensions.

The message is clear: static URLs are better than dynamic ones, but if you have a dynamic site the URLs must be as simple as possible, with only one or two query parameters and no session IDs.

A URL that might look like this:

http://ift.tt/2ntCivH

Should preferably look like this:

http://ift.tt/2n115vH
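For a purely hypothetical illustration of the difference, compare something like http://www.example.com/catalog.php?sessionid=8a3f2&cat=12&item=987 with http://www.example.com/catalog/chess-sets/.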

How you achieve this depends on whether you are starting out with a new site or have an established site with existing complex URLs.

If it is a new site then search engine friendly URLs must be built into the design criteria. How this is done depends on the programming language. For example, if you planned to use PHP then you might make use of the PATH_INFO variable (a request such as example.com/index.php/chess-sets looks static to a crawler but still reaches your script), or if you use ASP.NET then you could modify the Global.asax file.

If you plan to use a content management system (CMS) then make sure that it generates search engine friendly urls out of the box. The Content Management Comparison Tool has a check box for ‘Friendly URLs’ if you are researching CMS tools.

A completely different approach (not approved of by geeks, but worth considering if you are designing your own site as a non-professional) is to create static HTML web pages from a database or spreadsheet, but not in real time. WebMerge, for example, works with any database or spreadsheet that can export in tabular format, such as FileMaker Pro, Microsoft Access and AppleWorks. Using HTML template pages, WebMerge makes a new HTML page from the data in each record of the exported file. It can also create index pages with links to the other pages, and the generated pages can be hosted without the need for a database.

If it is an existing site then problematic URLs can be converted to simple URLs in real time. If you are on an Apache server you can use mod_rewrite to rewrite requested URLs on the fly. This requires knowledge of regular expressions, which can be rather daunting if you are not a programmer; fortunately there is an abundance of mod_rewrite expertise at RentACoder if you get stuck. If you are on Internet Information Server (IIS) you can use something like ISAPI_Rewrite, which also requires knowledge of regular expressions.
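To give a flavour of what a rewrite rule looks like (a minimal illustrative sketch with hypothetical paths; a real rule set needs testing against your own URL scheme), these lines in an Apache .htaccess file would map the friendly form onto the real script:

RewriteEngine On
# Serve /catalog/chess-sets/ from /catalog.php?cat=chess-sets (hypothetical paths)
RewriteRule ^catalog/([a-z0-9-]+)/?$ /catalog.php?cat=$1 [L,QSA]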

Whatever your solution, you should try to incorporate your keywords in the URLs, and only ever use hyphens, never underscores or spaces.

The post Search Engine Friendly URLs appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2opx5tB
via IFTTT

Monday, March 20, 2017

Frames

Much like Gertrude Stein’s painter, web designers have high hopes for their framed web designs but are very often disappointed. This is because framed websites do not fit the conceptual model of the web, where every page corresponds to a single URL. Consequently designers must use a variety of tricks to overcome the disadvantages, and if you miss a trick there can be unpleasant results.

Designers intent on using frames can use the NOFRAMES element to provide alternative content. This does not mean the useless alternative content provided by so many designers, such as “This site requires the use of frames” or “Your browser does not support frames”, which is a great way to prevent your website being found on a search engine. The correct use of NOFRAMES is described in the W3C document Frames in HTML documents.

Apart from having to provide alternative content, the other major problem is what happens when a search engine query matches an individual frame of a page. The search engine simply returns the URL for that frame, and if a user clicks the link the page will not be displayed in a frame, because there is no frameset corresponding to that URL. Designers get round this by detecting when a content page is trying to display outside its frameset and redirecting either to the home page or to a framed page that loads the orphan into an alternative frameset. If you really want to know how to do this you can read a description of the technique, using JavaScript, in Give orphan pages a home.

Framed sites also have a problem obtaining inbound links, because it is not easy for someone to link to one of the content pages. Either they must link to the home page and give directions to the page they want to point to, or they must bypass the frame arrangement. If it’s not easy to link, only the very determined will go to the trouble of doing so.

If you want the framed look but don’t want the problems you can achieve it through cascading style sheets. Stu Nicholls has an excellent example on his website CSS Play (and there are lots of other interesting experiments with cascading style sheets on Stu’s site).

The bottom line is this: if your web designer uses frames, seek a better and more experienced designer, and if you find framed sites attractive in spite of the problems, ask yourself why your competitors do not use them.

The post Frames appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2mMeILn
via IFTTT

Sunday, March 5, 2017

Flash

Triumph and dismay are two feelings Flash designers know well. The triumph comes with mastering a rich diversity of features in a difficult technology and producing a visually appealing result. The dismay comes when the SEO wants to remove the Flash components from the website they have just designed!

Initially a simple tool for delivering low-bandwidth animations over the web, Flash has evolved into a platform for delivering complex applications and rich media content, able to deliver far more than animations or short video clips.

Flash has become the delivery mechanism of choice for educational and complex business applications. Universities use Flash to great effect for delivering entire lectures with quizzes and assessments in real-time. In commerce Flash is used for everything from cattle auctions to virtual laboratory experiments.

However, its use on websites has declined, and there are two reasons for this. Firstly, every usability study ever done shows that web surfers dislike Flash intensely, particularly Flash intros. Secondly, Flash is a visual experience and search engine robots are blind, which means the SEO of Flash sites is problematic. Sites designed around Flash, or with Flash intros and Flash navigation, are often developed at the request of clients who do not know any better, and the developers have not sought to educate them.

Take for example a site that is completely built in Flash. Although there are several pages of information, because the navigation and the content are all in Flash the search engines are only aware of one page.

This site cannot even be found for the organization’s name and might just as well not exist. Flash enthusiasts might claim that this is just a poor implementation and that it is possible to optimize Flash sites. It is true that there are a variety of methods used to optimize Flash sites, including placing the Flash inside invisible framesets or using invisible layers in Cascading Style Sheets to present content to the search engines. Macromedia even has a Search Engine SDK, but in reality none of these methods is entirely effective. Sometimes you will even see a Flash site duplicated with an HTML version for the search engines, but the bottom line is: why bother with the Flash site at all if users don’t like it?

And even where Flash may (or may not) be effective as a product demo, it does nothing for the search engines. If used, it should be placed on a normally optimized page and not considered a replacement for text. Even then, whether something like this is worth spending time and money on is a moot point.

The post Flash appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2lsOOiR
via IFTTT