

Page Headings


Can we find any definitive proof online that you need to use heading tags (H1, H2, H3, H4, H5, H6), or that they improve rankings in Google? Not really, and I have seen pages do well in Google without them – but I do use them, especially the H1 tag on the page. For me, it’s another piece of a ‘perfect’ page, in the traditional sense, and I try to build a site for both Google and humans.


I still generally only use one H1 heading tag on my keyword-targeted pages – I believe this is the way the W3C intended it to be used in HTML4 – and I ensure it sits at the top of the page, above the relevant page text, written with my main keywords or related keyword phrases incorporated. I have never experienced any problems using CSS to control the appearance of heading tags, making them larger or smaller.

You can use multiple H1s in HTML5, but most sites I work on still use HTML4. I use as many H2–H6 tags as necessary, depending on the size of the page, though in practice I rarely need more than H1, H2 and H3. Whatever you do with header tags, be consistent, to give your users the best user experience.
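As a sketch of that structure – one H1 at the top of the page above the relevant text, with H2s and H3s beneath it – the page name and keyword phrases below are invented for illustration:

```html
<!-- A single H1 containing the main keyword phrase, at the top of the page -->
<h1>Search Engine Optimisation Tips</h1>
<p>Relevant introductory page text, below the heading...</p>

<!-- H2 for each main section, H3 for sub-sections, kept in order -->
<h2>Page Headings</h2>
<p>...</p>
<h3>How Many Words In The H1 Tag?</h3>
<p>...</p>
```

CSS can then be used to size these headings however you like without changing the document outline.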

How many words in the H1 tag? As many as I think is sensible – usually as short and snappy as possible. I also discovered Google will use your heading tags as page titles, at some level, if your title element is malformed. As always, be sure to make your heading tags highly relevant to the content on that page – and not too spammy, either.

Alt Tags

ALT text is counted by Google (and Bing), but I would be careful about over-optimising it. I’ve seen a lot of websites penalised for over-optimising invisible elements on a page. Don’t do it. ALT tags are very important and, I think, a very rewarding area to get right. I always put the main keyword in an ALT once when addressing a page. Don’t optimise your ALT tags (or rather, ALT attributes) JUST for Google! Use them for descriptive text that helps visitors – and keep them unique where possible, as you do with your titles and meta descriptions.

Don’t obsess. Don’t optimise your ALT tags just for Google – do it for humans, accessibility and usability. If you are interested, I conducted a simple test to determine how many words Google would pick up in image ALT text. And remember – even if, like me most days, you can’t be bothered with all the image ALT tags on your page, at least use a blank ALT (or NULL value) so people with screen readers can enjoy your page.

About Alt Tags:

  • The alt attribute should be used to describe the image. So if you have an image of a big blue pineapple chair, you should use the ALT text that best describes it: alt=”big blue pineapple chair”.
  • The title attribute should be used when the image is a hyperlink to a specific page. It should contain information about what will happen when you click on the image. For example, if the image will get larger, it should read something like title=”View a larger version of the big blue pineapple chair image”.

As the Googlebot does not see the images directly, we generally concentrate on the information provided in the “alt” attribute. Feel free to supplement the “alt” attribute with “title” and other attributes if they provide value to your users! So for example, if you have an image of a puppy (these seem popular at the moment) playing with a ball, you could use something like “My puppy Betsy playing with a bowling ball” as the alt attribute for the image. If you also have a link around the image, pointing to a large version of the same photo, you could use “View this image in high-resolution” as the title attribute for the link.
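Putting the advice above together, a minimal sketch – the image file names here are invented for illustration:

```html
<!-- ALT text describes the image itself, for Google and for screen readers;
     the title attribute on the link says what clicking will do -->
<a href="puppy-large.jpg" title="View this image in high-resolution">
  <img src="puppy.jpg" alt="My puppy Betsy playing with a bowling ball">
</a>

<!-- A purely decorative image gets a blank (NULL) ALT value,
     so screen readers skip it rather than reading out the file name -->
<img src="divider.png" alt="">
```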

Link Title Attributes, Acronym & ABBR Tags

Does Google Count Text in The Acronym Tag?

From my tests, no. From observing how my test page ranks, Google appears to ignore keywords in the acronym tag. My observations from that test page include:

  • Link Title Attribute – no benefit passed via the link either to another page, it seems
  • ABBR (Abbreviation Tags) – No
  • Image File Name – No
  • Wrapping words (or at least numbers) in SCRIPT – Sometimes. Google is better at understanding what it can render in 2016.

It’s clear many invisible elements of a page are completely ignored by Google (elements that would otherwise interest us SEOs).
Some invisible items are (still) apparently supported:

  • NOFRAMES – Yes
  • NOSCRIPT – Yes
  • ALT Attribute – Yes

Unless you really have cause to focus on any particular invisible element, I think the P tag is the most important tag to optimise in 2016.

Search Engine Friendly URLs (SEF)

Clean URLs (or search engine friendly URLs) are just that – clean, easy to read, simple. You do not need clean URLs in site architecture for Google to spider a site successfully (confirmed by Google in 2008), although I do use clean URLs as a default these days, and have done so for years.

It’s often more usable. Is there a massive difference in Google when you use clean URLs?

No – in my experience it’s very much a second- or third-order effect, perhaps even less, if used on its own. However, there is a demonstrable benefit to having keywords in URLs. The thinking is that you might get a boost in Google SERPs if your URLs are clean – because you are using keywords in the actual page name instead of a parameter or session ID number (which Google often struggles with).
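For illustration, this is the kind of difference being described (both URLs are hypothetical):

```
Dynamic URL: http://www.example.com/page.php?id=123&sessionid=xyz789
Clean URL:   http://www.example.com/search-engine-optimisation/
```

The clean version puts the target keyword phrase in the page name itself, with no parameters or session IDs for Google to struggle with.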

I think Google might reward the page some sort of relevance because of the actual file / page name. I optimise as if it does. It is virtually impossible to isolate any ranking factor with any degree of certainty. Where any benefit is slightly detectable is when people (say, in forums) link to your site with the URL as the link text.

Then it is fair to say you do get a boost because keywords are in the actual anchor text link to your site, and I believe this is the case, but again, that depends on the quality of the page linking to your site. That is, if Google trusts it and it passes Pagerank (!) and anchor text benefit. And of course, you’ll need citable content on that site of yours.

Sometimes I will remove the stop words from a URL and leave the important keywords as the page name, because a lot of forums garble a URL to shorten it. Most forum links will be nofollowed in 2016, to be fair, but some old habits die hard. Sometimes I prefer to see the exact phrase I am targeting as the name of the URL I am asking Google to rank.

It should be remembered that although Googlebot can crawl sites with dynamic URLs, many webmasters assume there is a greater risk it will give up if the URLs are deemed unimportant and contain multiple variables and session IDs (a theory).

As standard, I use clean URLs where possible on new sites these days, and try to keep the URLs as simple as possible and not obsess about it. That’s my aim at all times when I optimise a website to work better in Google – simplicity. Google does look at keywords in the URL, even at a granular level.

Having a keyword in your URL might be the difference between your site ranking and not – potentially useful to take advantage of long tail search queries – for more see Does Google Count A Keyword In The URI (Filename) When Ranking A Page?

Absolute Or Relative URLs

My advice would be to keep it consistent whatever you decide to use. I prefer absolute URLs. That’s just a preference. Google will crawl either if the local setup is correctly developed.

  • What is an absolute URL? Example – http://www.hobo-web.co.uk/search-engine-optimisation/
  • What is a relative URL? Example – /search-engine-optimisation.htm

Relative just means relative to the document the link is on. Move that page to another site and it won’t work. With an absolute URL, it would work.
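Using the examples above, the same link written both ways (as it might appear in a page on http://www.hobo-web.co.uk/):

```html
<!-- Absolute URL: works regardless of where the linking page lives -->
<a href="http://www.hobo-web.co.uk/search-engine-optimisation/">SEO</a>

<!-- Relative URL: resolved against the current document's location,
     so it breaks if the page is moved to another site -->
<a href="/search-engine-optimisation.htm">SEO</a>
```

Either form is fine for Google – the point is to pick one and stay consistent.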

Subdirectories or Files For URL Structure

Sometimes I use subfolders and sometimes I use files. I have not been able to decide if there is any real benefit (in terms of ranking boost) to using either. A lot of CMSes these days use subfolders in their file paths, so I am pretty confident Google can deal with either. I used to prefer files like .html when I was building a new site from scratch, as they were the ‘end of the line’ for search engines, as I imagined it, and a subfolder (or directory) was a collection of pages.

I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me to use files on most websites I created (back in the day). Once subfolders are trusted, it’s six of one and half a dozen of the other what the actual difference is in terms of ranking in Google – usually, rankings in Google are more determined by how RELEVANT or REPUTABLE a page is for a query.

In the past, subfolders could be treated differently than files.

Subfolders can be trusted less than other subfolders or pages in your site, or ignored entirely. Subfolders used to seem, to me, to take a little longer to get indexed by Google than, for instance, .html pages. People talk about trusted domains, but they don’t mention (or don’t think) that some parts of a domain can be trusted less. Google treats some subfolders differently. Well, it used to – and remembering how Google used to handle things has some benefits, even in 2016.

Some say don’t go beyond four levels of folders in your file path. I haven’t experienced too many issues, but you never know.

UPDATED – I think in 2016 it’s even less of something to worry about. There are much more important elements to check.

Chuck Reynolds
Contributor

The Next Big Business Buzzword



Ecosystem

There’s a new word on the block.  And that new word – ecosystem – actually has a richer meaning than a typical business buzzword.  However, just as the word has started to get traction, it is already at risk of being dumbed down.  The thought of this meaningful word being stillborn is sad.  So I need your help.  Let’s save the word ecosystem before it’s too late!

The evidence for the growing usage of the word ecosystem is measurable.  However, to truly appreciate its rapid ascent, you have to combine the word with another word.  In recent years, the use of ecosystem appears to spike seemingly out of nowhere.  So it’s clear that the word ecosystem is getting used more and more.  But what does the word really mean?  And why does it matter?

Biologists use ecosystem to describe a community of organisms interacting in their environment.  We can analogize that definition to business as well.  Thus, in a business ecosystem, the relationships between things matter.  In fact, they matter a lot.  Everything touches everything else.  And so the way that humans interact has a big effect on the system.  For instance, two people sitting in a coffee shop today might be starting the world’s next most important company, or they might not.  It depends largely on how they interact.  Do they like each other?  Do they inspire each other?  Do they trust each other enough to take a chance together?  The world might turn on those answers.

However, over the last couple of years, I’ve seen ecosystem used interchangeably with terms like “cluster” or “network” or “organization.”  It feels a bit like Buzzword Bingo as if folks are just swapping words out because the old ones feel tired, like clothing that’s been worn too many times.  But this is terrible because they are missing the key difference.  The terms “cluster” or “network” or “organization” are rather static.  They describe the mere presence of assets in a system, like blocks on a wall chart.

An ecosystem, on the other hand, is about the dynamic interactions between things.  It’s about how people meet, talk, trust, share, collaborate, team, experiment, and grow together.  When an ecosystem thrives, it means that the people have developed patterns of behavior – or culture – that streamline the flow of ideas, talent, and capital throughout a system.

The word ecosystem is important because its growing usage points to a profound shift in how society thinks of economic value.  It says that individuals matter.  That their actions can transform industries, even the entire world.  That little things can make a big difference.  This is not just fuzzy talk.  It’s actually true because every great company or product started with two people sitting in a coffee shop scratching on a napkin.  Or some variation of that.

Thus, an ecosystem is more than just a word.  It’s actually a way of viewing the world.  So please join me.  Together, we can save the word ecosystem from the awful clutches of Buzzword Bingo!

Chuck Reynolds
Contributor

Business ecosystems come of age


Part of the Business Trends series

Businesses are moving beyond traditional industry silos and coalescing into richly networked ecosystems, creating new opportunities for innovation alongside new challenges for many incumbent enterprises.

In September 2014, the Chinese online commerce company Alibaba Group conducted its initial public offering (IPO)—the largest ever in history. This event attracted considerable media attention, some of it naturally commenting on the changing balance of the global economy and the growing impact of digitization. Largely overlooked in the commentary, however, was another important signpost to the future. In the prospectus it compiled to describe its vision, philosophy, and growth strategy, Alibaba used one word no fewer than 160 times: “ecosystem.”

We’re all familiar with ecosystems in the natural world. The word was coined in the 1930s by British botanist Arthur Tansley to refer to a localized community of living organisms interacting with each other and their particular environment of air, water, mineral soil, and other elements. These organisms influence each other, and their terrain; they compete and collaborate, share and create resources, and co-evolve; and they are inevitably subject to external disruptions, to which they adapt together.


Noticing growing parallels, business strategist James Moore imported the concept to the increasingly dynamic and interconnected world of commerce. As he wrote in a 1993 Harvard Business Review article:

Successful businesses are those that evolve rapidly and effectively. Yet innovative businesses can’t evolve in a vacuum. They must attract resources of all sorts, drawing in capital, partners, suppliers, and customers to create cooperative networks. . . . I suggest that a company be viewed not as a member of a single industry but as part of a business ecosystem that crosses a variety of industries. In a business ecosystem, companies co-evolve capabilities around a new innovation: They work cooperatively and competitively to support new products, satisfy customer needs, and eventually incorporate the next round of innovations.

Moore’s insight was prescient—just on the cusp of the Internet era, and 15 years before the emergence of smartphones and the mobile access revolution. Initially his concept of “business ecosystems” was embraced primarily in the community that was itself creating the transformative capabilities of connection and collaboration that enabled them—the US technology sector. It continues to be critically important in that arena. Apple Inc. explicitly conceived its products and services as an ecosystem that would provide customers with a seamless experience; Facebook recognized the emphasis it had to place on deliberately building its “developer ecosystem”; some analysts no longer see technology and media competition as simply between firms, but among ecosystems of firms operating in loose alliance.

But the idea has now also taken root far beyond the US technology sector. Over the last few decades, driven largely by digital technologies and massively increased connectivity, the economy has been moving beyond narrowly defined industries built around large, vertically integrated, and mainly “self-contained” corporations. New means of creating value have been developing everywhere in the form of ever-denser and richer networks of connection, collaboration, and interdependence.

Businesses around the world are responding. Some view the rise of ecosystems as an opportunity for creating powerful new competitive advantage. For example, in July 2014, the CEO of Japan’s Softbank described its strategic intent as follows: “By providing all manner of services and content on (our) platforms, we are aiming to create a comprehensive ecosystem that other companies will never be able to rival.” A few years earlier, the CEO of Nokia similarly described a landscape of ecosystems that each encompass an array of players: “The battle of devices has now become a war of ecosystems . . . Our competitors aren’t taking our market share with devices; they are taking our market share with an entire ecosystem. This means we’re going to have to decide how we either build, catalyze, or join an ecosystem.”

Others take slightly different perspectives. South Africa’s SABMiller has made a priority of “strengthening business ecosystems” in which it participates, to the benefit of local and regional economies where it operates. Some leaders have even welcomed competitors to their ecosystems. Listen to MakerBot’s newly appointed CEO, Jenny Lawton, responding to the news that Autodesk planned a bigger push into 3D printing: “Autodesk’s work and thinking is necessary to the overall industry . . . So much of the success of the 3D ecosystem and future of 3D printing can be accelerated.”

And some strong ecosystems emerge without individual powerful players: For example, in China the term “shanzhai” formerly described copycat versions of electronic goods, but people now commonly speak of the “shanzhai ecosystem”—highly collaborative arrangements across hundreds of enterprises that are accelerating entrepreneurial innovation in areas such as smartphones and the next generation of smart watches.

Especially given the diverse usage of the term, it might be tempting to dismiss “ecosystem” as yet another management buzzword in an increasingly jargon-congested business lexicon. Certainly, the term has so far defied a precise and universally agreed definition (despite valiant efforts by many academics and theorists). But the concept’s rapid spread and uptake points to a practical utility that should not be underestimated.

At a bare minimum it has provided many businesses with a powerful metaphor from the natural world. Metaphors matter: They make it easier to explore and understand abstract concepts, and inform the decision-making heuristics and mental models leaders use to make confident choices and take timely action. Our business metaphors have historically been drawn largely from machinery and engineering, warfare and the military, and competitive sports and games. These remain apt in many ways—but in increasingly dynamic, collaborative, interdependent situations, they might mislead just as much as they inform.

Ecosystems thinking provides a new frame and mindset that captures a profound shift in the economy and the business landscape. The importance of relationships, partnerships, networks, alliances, and collaborations is obviously not novel—but it is growing. As it becomes increasingly possible for firms to deploy and activate assets they neither own nor control, to engage and mobilize larger and larger numbers of participants, and to facilitate much more complex coordination of their expertise and activities, the art of the possible is expanding rapidly.

Five ways to think about ecosystems

Having noted the varied definition and broad application of the term ecosystem, it makes sense to clarify how it is being used in this document: Ecosystems are dynamic and co-evolving communities of diverse actors who create and capture new value through increasingly sophisticated models of both collaboration and competition. This definition allows for the fact that ecosystems come in a broad array of shapes, sizes, and varieties—and also captures three core characteristics that are generally present.

First, ecosystems enable and encourage the participation of a diverse range of (large and small) organizations, and often individuals, who together can create, scale, and serve markets beyond the capabilities of any single organization. This provides the requisite variety for a healthy system.

Second, participating actors interact and co-create in increasingly sophisticated ways that would historically have been hard to formally coordinate in a “top-down” manner, by deploying technologies and tools of connectivity and collaboration that are still proliferating and disseminating. This means that there is dynamism and substantial latent potential for increasingly productive ecosystem development in the years ahead.

Third, participants—often including customers—are bonded by some combination of shared interests, purpose, and values, which incents them to collectively nurture, sustain, and protect the ecosystem as a shared “commons.” Everyone contributes; everyone benefits. This enhances the longevity and durability of ecosystems.


Our definition here is broadly consistent with the literature, and the thinking to date among business leaders, advisors, and academics, which continues to evolve as ecosystems become an increasingly critical unit of analysis. But there are further patterns and aspects of ecosystems that are now also coming into sharper focus as we consider the emerging opportunities and challenges for enterprises.

Ecosystems create new ways to address fundamental human needs and desires

An economy—from the most primitive to the most advanced—is essentially a system organized to meet (and often shape) human needs and desires. The major economies that arose through the course of the last two centuries developed around the best available and most ingenious means of doing so—our long-familiar industries. In the United States, these were first codified in the 1937 Standard Industrial Classification (SIC) system, which captured well the economic and business arrangements that transformed our lives for much of the 20th century. But these structures are, inevitably, changing.

Humanity did not necessarily want physicians, hospitals, and pharmaceuticals—we wanted wellness. We did not particularly crave classrooms and textbooks and teachers—we wanted to learn and achieve success. We did not demand coal mines and oil and gas extraction—we wanted energy beyond the muscles of humans and harnessed animals. In many parts of the economy today, new cross-cutting ecosystems are starting to forge new means to meet our desired ends.

Looking forward, let’s consider, for example, the automobile industry that has richly enhanced so many lives around the world. It is certainly possible to imagine the emergence of a very different ecosystem to satisfy the desire for fast, affordable, safe, and convenient personal mobility, but that might also significantly reduce the appeal of privately owned cars. Confidence could rise for “autonomous vehicles” or self-driving cars (with a technology company, Google, perhaps helping lead the way?). Carsharing might, in turn, become more attractive (as cars gain the ability to deliver themselves to your door). Many car- and ride-share businesses are already experimenting, learning, and tapping into the different values of the Millennial generation. For some cities, according to former General Motors R&D chief Lawrence Burns, “about 80 percent fewer shared, coordinated vehicles would be needed than personally owned vehicles to provide the same level of mobility, with less investment.” While such dramatic change is certainly not inevitable, it is plausible that new “mobility ecosystems” might coalesce around the automobile industry, and include city planners, technology and energy players, public transportation providers, regulators, infrastructure and construction players, insurance companies, and peer-to-peer networks—collaborating, adapting, and responding to one another’s moves, and once again transforming and improving our lives.

Ecosystems drive new collaborations to address rising social and environmental challenges

A distinctive characteristic of many ecosystems is that they form to achieve something together that lies beyond the effective scope and capabilities of any individual actor (or even group of broadly similar actors). In some instances, these relate to large societal problems that no individual organization is able to address alone, such as child poverty, inner-city violence and gun crime, and food safety. All are obviously critical and—in some areas at least—are sources of growing pressure or threat.

Take as an example the Global Food Safety Initiative (GFSI), a non-profit organization whose members include many of the world’s largest food producers, distributors, and retailers. It helps coordinate a global, co-creative, and collaborative approach to addressing the growing challenge in a global food system of ensuring safety for consumers and protecting the reputation of the industry. Some of its members compete ferociously in their markets, but also collaborate aggressively to ensure the certification, shared standards, superior monitoring, and shared learning and leading practices that together create a safer food industry and boost consumer confidence. Here is new ecosystem-oriented behavior in which every participant benefits from their collective investment in the shared “commons”—and has acknowledged that, in the words of GFSI: “Food safety is not a competitive advantage.”

Ecosystems create and serve communities, and harness their creativity and intelligence

Multiple, and on the surface highly diverse, disciplines that examine the human condition—from anthropological and archeological studies of ancient “wisdom” cultures, through theology and philosophy, to today’s behavioral economics and even neuroscience—converge around a few key fundamentals. People want to belong, to understand and be understood, to achieve acknowledged competence in their chosen arena, and to make a positive difference in their world. Historically, few people could fully realize these desires beyond their own immediate and tightly constrained physical domains. Today, technology has transformed the ways and levels in which such self-actualization can occur—and many ecosystems are now benefiting from this vital shift.

The most obvious illustrations are the many business ecosystems that have been designed specifically to enable us to find and connect with our own “tribes”—those that surround businesses like Facebook, Twitter, and Yelp. Recognizing the importance of its top users (the site’s most prolific reviewer has written more than 8,000 reviews), Yelp founded its Elite program to recognize and reward its community of regular reviewers with exclusive parties and freebies.

But also consider three other exemplary arenas. Online gaming is today a $20 billion growing business, and many of the most successful games not only connect people around the globe but actively engage them in the continuous development of the games themselves. The open source movement, which originally attracted extraordinary—and often unpaid—contributions from hundreds of thousands of highly skilled individuals in the software environment, has been spreading across the economy. Other examples can now be found across diverse industries and sectors. In media, Blender’s free and open source 3D computer graphics program has been used to generate outputs as diverse as 3D models of NASA spacecraft and storyboards for Spider-Man 2. In education, the Massachusetts Institute of Technology’s OpenCourseWare provides digital access to “virtually all MIT course content.”

And, for solving more specific (and sometimes also time-bound) problems, there has been a substantial rise recently in “crowd-sourcing.” Organizations like Kaggle host competitions that invite participants to use data science and algorithms to solve business problems. Others, like XPRIZE, organize grand challenges that encourage players to collaborate to tackle complex social and environmental issues. The results already speak for themselves—examples include a device that skims oil off water three times faster than previously existing technology, and software that is able to show trends in symptoms of Parkinson’s disease in individual patients over time.

Today, almost every business can find ways to benefit from this growing and global opportunity to forge, serve, and grow alongside—and with the help of—new communities, which will often include customers who were traditionally regarded as passive recipients rather than active participants. The LEGO Group, for instance, has found new ways to connect with customers young and old with its LEGO® Ideas portal, on which fans have enthusiastically submitted and supported ideas including the Minecraft and Ghostbusters 30th Anniversary toy sets. Companies that are able to tap into the resourcefulness of their ecosystems will not only discover new points of resonance with their customers but also open themselves to a universe of opportunity, just as The LEGO Group did when it found inspiration for its blockbuster The LEGO Movie™ in a collection of stop-motion films produced with LEGO bricks on YouTube.

Ecosystems often exist on top of powerful new business platforms

A “platform” is a powerful type of ecosystem, typically created and owned by a single business or entity, but deliberately designed to attract the active participation of large numbers of other actors. According to scholar Yochai Benkler, it is “a technical and organizational context in which a community can interact to achieve a specific purpose.” Some are designed primarily to create new markets by enabling connections between previously separated potential buyers and sellers; others are more focused on the distributed development of new products, services, and solutions. An early illustrative example combined both. In 1968, Dee Hock worked in a local bank in Washington State and spotted a problem and an opportunity in the early days of consumer credit cards. Many banks were attempting to issue their own proprietary product, each of them encountering the laborious burden of signing up merchants to accept them, persuading customers of their utility and security, reassuring regulators, and designing protocols and features for the new product. By proposing a shared platform to deal with all these arduous tasks—which became VISA in 1976—he enabled them to pool resources and to both collaborate and compete within a much simpler-to-develop, and hence much faster-growing, financial market for credit.

VISA may have been an early example, but it has since been joined by many other platforms, spurred by digitization and connective technologies. eBay created a global auction-based marketplace that now connects millions of buyers and sellers. More recently, a variety of new “sharing economy” platform businesses have established entirely new ecosystems that enable vast numbers of participants to share access to their previously idle or under-utilized assets, creating significant social and economic value in the process. Some, such as Airbnb and Uber, have disrupted incumbent industries—and more will likely do so in future, in additional domains of the economy.

Meanwhile, other platforms have emerged to accelerate and distribute the development of new products and services. An early example was the success of open source models that transformed the software sector by inviting vast numbers of programmers to develop products such as Linux. This established the pathway to the explosive, widely distributed development of new applications on enabling platforms created by Apple, Facebook, Google, Samsung, Salesforce, and others. In recent years literally millions of apps have been created, producing new solutions and possibilities for consumers and enterprises alike.

The results have been spectacular for some platform creators. One estimate suggests that four of the top five most valuable global brands are largely based on platform business models. With many of the world’s fastest-growing, highest-profile new companies joining them, there is no sign of the phenomenon slowing.

Ecosystems accelerate learning and innovation

Philosopher Eric Hoffer observed that “In times of change learners inherit the earth, while the learned find themselves beautifully equipped to deal with a world that no longer exists.” The imperatives for businesses to learn and to translate learning into innovation have never been greater. And, as many corporate leaders have recognized, the smartest people can’t all work for just one organization; this means, importantly, that they don’t all work for yours.

Ecosystems provide businesses access to sharp minds and smart resources, whether they are located with suppliers, customers, research organizations, or independently. For example, users of InnoCentive connect with an ecosystem of thousands of innovators and problem-solvers around the world. A Telstra executive once said he seeks out partners who will push new thinking: “When we look to partner, we look for . . . innovation . . . what you’re looking for is someone who’s going to challenge you. I don’t want you to tell me I’m good. I want you to tell me what I have to do differently, how I can be different.” Learning is a largely social activity; innovation is very often the result of integration and connection across different fields of expertise and domains of knowledge; and both are therefore accelerated in the fluid, exchange-oriented, and co-creative communities that are forged by ecosystems. Some observers, notably John Hagel, have suggested that such ecosystems will prove the most enduring and influential, and provide the most sustained and important benefits to those businesses that create, lead, and participate in them.

For example, the Mahindra Group, one of India’s largest corporations with more than 200,000 employees globally and an enormous supplier network, was recently celebrated for linking suppliers and internal businesses alike in jointly owned initiatives to “accept no limits, drive positive change, and promote alternative thinking.” The resulting ecosystems of collaboration have benefited Mahindra itself by energizing and aligning learning and creativity across the diversified group. Just as importantly, however, Mahindra credits efforts like this as promoting widespread transformation across entire geographies where Mahindra’s operations are centered, like Maharashtra, India’s second most populous state and its largest contributor to GDP by far. The dynamism and productivity of such local hubs have given rise around the world to a growing focus on local and regional “start-up ecosystems” and “innovation ecosystems”—a trend actively encouraged in November 2014 by a number of senior European business leaders in an open letter to the European Union.

The world is entering an era in which ideas and insights come from everywhere, and crowds, clouds, collaborators, competitions, and co-creators can fundamentally help define our shared future. The business environment is being permanently altered as a result.

Managing in a world of business ecosystems

The rise of business ecosystems is fundamentally altering the key success factors for leading organizations, forcing them to think and act very differently regarding their strategies, business models, leadership, core capabilities, value creation and capture systems, and organizational models. More will be learned over time, as ecosystems continue to reveal their secrets. This ongoing process is not surprising. After all, it was only in the late 1930s that we created standardized classifications for the distinct sectors of the industrial economy and then started to track their collective output with a measure called GDP. It took almost another 30 years to hammer out the detailed, if still evolving, standards for many business professions, and almost 20 years more for the basic tools of “strategy” to be revealed.

In this trends report, however, we will take a deeper dive into what is already clearly discernible as business ecosystems come of age—and can therefore be applied to business strategy and operations today. Specifically, we will explore the following trends and the associated ways in which future-shaping leaders are:

  • Transcending historical constraints as multiple boundaries blur and dissolve simultaneously, to create new value and redefine the “art of the possible.”
  • Participating in evolving ecosystems that forge alliances to address major pressing societal challenges through new solutions, generating both profits and social value at the same time.
  • Engaging with the domains of regulation and policy to maintain an effective balance between protecting the public’s interest and enabling the new markets and solutions which fast-evolving ecosystems make possible.
  • Reimagining existing supply chains as “value webs” that enjoy greater autonomy and trust, learn and innovate together, and forge the sustainable models for success that benefit all those involved.
  • Reconfiguring assets for a more relationship- and collaboration-based economy in which ownership and control matter less, and activating the assets of others matter more, altering M&A strategies in the process.
  • Creating new enterprise platforms that enable the entrepreneurship, and help liberate and harness the talents, of countless other participants.
  • Learning to transform businesses and organizations without destroying them, by taking a lesson from the entrepreneurs’ use of minimum viable products.
  • Embracing new core competencies—especially the skills embedded in the world of design—and reinventing their management thinking and practices.

Chuck Reynolds
Contributor

Here’s a little about Meta Keywords Tags

A hallmark of shady natural search engine optimisation companies – the meta keywords tag. Companies that waste time and resources on these items waste clients' money – that's a fact:

If you are relying on meta keyword optimisation to rank for terms, you are dead in the water. From what I see, Google and Bing ignore meta keywords – or, at least, place no weight on them to rank pages. Yahoo may read them, but really, a search engine optimiser has more important things to worry about than this nonsense.

What about other search engines that use them? Hang on while I submit my site to those 75,000 engines first [sarcasm!]. Yes, ten years ago, early search engines liked looking at your meta keywords. I've seen OPs in forums ponder the best way to write these tags – with commas, with spaces, limited to how many characters. Forget about meta keyword tags – they are a pointless waste of time and bandwidth. We could probably save a rainforest with the bandwidth saved if everybody removed their keyword tags.
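For reference, this is the tag being discussed – the content below is an invented example of exactly the sort of markup you can safely delete:

```html
<!-- The meta keywords tag - ignored by Google and Bing, safe to remove -->
<meta name="keywords" content="seo, search engine optimisation, meta tags">
```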

Tin Foil Hat Time

So you have a new site. You fill your home page meta tags with the 20 keywords you want to rank for – hey, that's what optimisation is all about, isn't it? You've just told Google, by the third line of text, what to filter you for. The meta name="Keywords" tag was originally intended for words that weren't on the page but would help classify the document.

Sometimes competitors might use the information in your keywords to determine what you are trying to rank for, too. If everybody removed them and stopped abusing meta keywords, Google would probably start looking at them again – but that's the way of things in search engines. I ignore meta keywords and remove them from pages I work on.

Meta Description Tag

Like the title element, and unlike the meta keywords tag, this one is important, both from a human and a search engine perspective. Forget about whether or not to put your keyword in it; make it relevant to a searcher and write it for humans, not search engines. That said, if you want this 20-word snippet – which should accurately describe the page you have optimised for one or two keyword phrases – to work for you when people search Google, make sure the keyword is in there.

I must say, I normally do include the keyword in the description, as this usually gets it into your SERP snippet. Google looks at the description, but there is debate over whether it uses the description tag to rank sites. I think it might, at some level, but again, it is a very weak signal. I certainly don't know of an example that clearly shows a meta description helping a page rank.

Sometimes, I will ask a question with my titles, and answer it in the description, sometimes I will just give a hint. That is a lot more difficult in 2016 as search snippets change depending on what Google wants to emphasise to its users. It’s also very important to have unique meta descriptions on every page on your site.
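For reference, the tag itself is simple – the description text below is invented for illustration:

```html
<!-- One unique, human-readable description per page -->
<meta name="description" content="A plain-English guide to writing unique page titles and meta descriptions that help both searchers and search engines.">
```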

Tin Foil Hat Time

Sometimes I think that if your titles are spammy, your keywords are spammy, and your meta description is spammy, Google might stop right there – even Google probably wants to save bandwidth at some point. Putting a keyword in the description won't take a crap site to number 1 or raise you 50 spots in a competitive niche – so why optimise for a search engine when you can optimise for a human? I think that is much more valuable, especially if you are in the mix already – that is, on page one for your keyword.

So, the meta description tag is important in Google, Yahoo and Bing and every other engine listing – very important to get it right.

Make it for humans.

Oh, and by the way – Google seems to truncate anything over about 156 characters in the meta description, although this may be limited by pixel width in 2016.
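If you generate descriptions from data, it is worth trimming them to that rough 156-character budget at a word boundary. A minimal sketch, assuming a simple character-count cut-off (Google actually truncates by pixel width, so treat 156 as a guideline only):

```python
def truncate_description(text, limit=156):
    """Trim a meta description to a character budget at a word
    boundary, appending an ellipsis if anything was cut.

    The 156-character limit is a rule of thumb from this article,
    not an official Google number - snippets are really truncated
    by pixel width.
    """
    if len(text) <= limit:
        return text
    # Leave room for the trailing ellipsis character
    cut = text[:limit - 1]
    # Back up to the last full word so we don't cut mid-word
    if " " in cut:
        cut = cut.rsplit(" ", 1)[0]
    return cut.rstrip(",;:") + "…"

print(truncate_description("Short enough already."))
```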

Programmatically Generate Meta Descriptions on Large Sites

Google says you can programmatically auto-generate unique meta descriptions based on the content of the page.

Follow Google's example:

<META NAME="Description" CONTENT="Author: J. K. Rowling, Illustrator: Mary GrandPré, Category: Books, Price: $17.99, Length: 784 pages">

…and their advice on why to do this:

No duplication, more information, and everything is clearly tagged and separated. No real additional work is required to generate something of this quality: the price and length are the only new data, and they are already displayed on the site.

I think it is very important to listen when Google tells you to do something in a very specific way, and Google does give clear advice in this area.
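A minimal sketch of that kind of template-driven generation, following the "Label: value" pattern in Google's example above. The `build_description` helper is illustrative, not a real API, and the field data is taken straight from Google's own example:

```python
def build_description(fields):
    """Build a meta description string from structured page data.

    `fields` is a mapping of label -> value; in a real site these
    would come from your product database or CMS.
    """
    return ", ".join(f"{label}: {value}" for label, value in fields.items())

book = {
    "Author": "J. K. Rowling",
    "Illustrator": "Mary GrandPré",
    "Category": "Books",
    "Price": "$17.99",
    "Length": "784 pages",
}
print(build_description(book))
# Author: J. K. Rowling, Illustrator: Mary GrandPré, Category: Books, Price: $17.99, Length: 784 pages
```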

 

Robots Meta Tag

I could use a robots meta tag to tell Google to index a page but not to follow any links on it – or, if for some reason I did not want a page to appear in Google search results at all, to keep it out of the index.
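For example, a robots meta tag that keeps a page out of the results while still letting its links be crawled looks like this – it goes in the page's &lt;head&gt;:

```html
<!-- Keep this page out of Google's index, but still crawl its links -->
<meta name="robots" content="noindex, follow">
```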

 

By default, Googlebot will index a page and follow links to it. So there’s no need to tag pages with content values of INDEX or FOLLOW. GOOGLE

There are various instructions you can make use of in your Robots Meta Tag, but remember Google by default WILL index and follow links, so you have NO need to include that as a command – you can leave the robots meta out completely – and probably should if you don’t have a clue.

Googlebot understands any combination of lowercase and uppercase. GOOGLE.

Valid values for the Robots Meta Tag "CONTENT" attribute are: "INDEX", "NOINDEX", "FOLLOW", and "NOFOLLOW".

Example Usage:

  • <meta name="robots" content="noindex, follow">
  • <meta name="robots" content="index, nofollow">
  • <meta name="robots" content="noindex, nofollow">
  • <meta name="robots" content="noarchive">
  • <meta name="googlebot" content="nosnippet">

Google understands and interprets the following robots meta tag values:

  • NOINDEX – prevents the page from being included in the index.
  • NOFOLLOW – prevents Googlebot from following any links on the page. (Note that this is different from the link-level NOFOLLOW attribute, which prevents Googlebot from following an individual link.)
  • NOARCHIVE – prevents a cached copy of this page from being available in the search results.
  • NOSNIPPET – prevents a description from appearing below the page in the search results, as well as prevents caching of the page.
  • NOODP – blocks the Open Directory Project description of the page from being used in the description that appears below the page in the search results.
  • NONE – equivalent to “NOINDEX, NOFOLLOW”.

Robots META Tag Quick Reference

Terms              Googlebot   Slurp   BingBot   Teoma
NoIndex            YES         YES     YES       YES
NoFollow           YES         YES     YES       YES
NoArchive          YES         YES     YES       YES
NoSnippet          YES         NO      NO        NO
NoODP              YES         YES     YES       NO
NoYDIR             NO          YES     NO        NO
NoImageIndex       YES         NO      NO        NO
NoTranslate        YES         NO      NO        NO
Unavailable_After  YES         NO      NO        NO

I’ve included the robots meta tag in my tutorial as this IS one of only a few meta tags / HTML head elements I focus on when it comes to managing Googlebot and Bingbot. At a page level, it is a powerful way to control whether your pages are returned in search results.

These meta tags go in the <head> section of an HTML page and represent the only head tags I care about for Google. Just about everything else you can put in the <head> of your HTML document is quite unnecessary and maybe even pointless (for Google optimisation, anyway).


 

Does the Page Title Element Matter?

The page title tag (or HTML Title Element) is arguably the most important on-page ranking factor (with regards to web page optimisation). Keywords in page titles can undeniably HELP your pages rank higher in Google results pages (SERPs). The page title is also often used by Google as the title of a search snippet link in search engine results pages.

A perfect title tag in Google is dependent on a number of factors. I will lay down a couple below, but I have since expanded my page title advice on another page (link below):

  1. A page title that is highly relevant to the page it refers to will maximise its usability, search engine ranking performance and click-through satisfaction rate. It will probably be displayed in a web browser’s window title bar, and in the clickable search snippet links used by Google, Bing & other search engines. The title element is the “crown” of a web page, with your important keyword phrase featuring AT LEAST ONCE within it.
  2. Most modern search engines have traditionally placed a lot of importance in the words contained within this HTML element. A good page title is made up of valuable keyword phrases with clear user intent.
  3. The last time I looked, Google displayed as many characters as it could fit into “a block element that’s 512px wide and doesn’t exceed 1 line of text”. So there is NO EXACT NUMBER OF CHARACTERS any optimiser can lay down as best practice to GUARANTEE a title will display in full in Google, at least as the search snippet title. Ultimately, only the characters and words you use will determine if your entire page title is seen in a Google search snippet. Until recently, Google displayed up to 70 characters in a title – but that changed in 2011/2012.
  4. If you want to ENSURE your FULL title tag shows in the desktop UK version of Google SERPs, stick to a shorter title of about 55 characters but that does not mean your title tag MUST end at 55 characters and remember your mobile visitors see a longer title (in the UK, in March 2015 at least). I have seen ‘up-to’ 69 characters (back in 2012) – but as I said – what you see displayed in SERPs depends on the characters you use. In 2016 – I just expect what Google displays to change – so I don’t obsess about what Google is doing in terms of display.
  5. Google is all about ‘user experience’ and ‘visitor satisfaction’ in 2016, so it’s worth remembering that usability studies have shown that a good page title length is about seven or eight words long and fewer than 64 total characters. Longer titles are less scannable in bookmark lists, and might not display correctly in many browsers (and of course will probably be truncated in SERPs).
  6. Google will INDEX perhaps 1000s of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE when determining relevance for ranking purposes. It is a very hard thing to try to isolate accurately, with all the testing and obfuscation Google uses to hide its ‘secret sauce‘. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
  7. You can probably include up to 12 words that will be counted as part of a page title, and consider using your important keywords in the first eight words. The rest of your page title will be counted as normal text on the page.
  8. NOTE: in 2016, the HTML title element you choose for your page may not be what Google chooses to include in your SERP snippet. The search snippet title and description are very much QUERY dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or from links to that page, to create a very different SERP snippet title.
  9. When optimising a title, you are looking to rank for as many terms as possible, without keyword stuffing your title. Often, the best bet is to optimise for a particular phrase (or phrases) – and take a more long-tail approach. Note that too many page titles and not enough actual page text per page could lead to Google Panda or other ‘user experience’ performance issues. A highly relevant unique page title is no longer enough to float a page with thin content. Google cares WAY too much about the page text content these days to let a good title hold up a thin page on most sites.
  10. Some page titles do better with a call to action – a call to action which reflects exactly a searcher’s intent (e.g. to learn something, or buy something, or hire something). Remember, this is your hook in search engines if Google chooses to use your page title in its search snippet, and there are a lot of competing pages out there in 2016.
  11. The perfect title tag on a page is unique to other pages on the site. In light of Google Panda, an algorithm that looks for a ‘quality’ in sites, you REALLY need to make your page titles UNIQUE, and minimise any duplication, especially on larger sites.
  12. I like to make sure my keywords feature as early as possible in a title tag but the important thing is to have important keywords and key phrases in your page title tag SOMEWHERE.
  13. For me, when improved search engine visibility is more important than branding, the company name goes at the end of the tag, and I use a variety of dividers to separate as no one way performs best. If you have a recognisable brand – then there is an argument for putting this at the front of titles – although Google often will change your title dynamically – sometimes putting your brand at the front of your snippet link title itself.
  14. Note that Google is pretty good these days at removing any special characters you have in your page title – and I would be wary of trying to make your title or Meta Description STAND OUT using special characters. That is not what Google wants, evidently, and they do give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA mark-up.
  15. I like to think I write titles for search engines AND humans.
  16. Know that Google tweaks everything regularly – why not what the perfect title keys off? So MIX it up…
  17. Don’t obsess. Natural is probably better, and will only get better as engines evolve. I optimise for key-phrases, rather than just keywords.
  18. I prefer mixed-case page titles, as I find them more scannable than titles in ALL CAPS or all lowercase.
  19. Generally speaking, the more domain trust/authority your SITE has in Google, the easier it is for a new page to rank for something. So bear that in mind. There is only so much you can do with your page titles – your website’s rankings in Google are a LOT more to do with OFFSITE factors than ONSITE ones – negative and positive.
  20. Click through rate is something that is likely measured by Google when ranking pages (Bing say they use it too, and they now power Yahoo), so it is worth considering whether you are best optimising your page titles for click-through rate or optimising for more search engine rankings.
  21. I would imagine keyword stuffing your page titles could be one area that Google looks at (although I see little evidence of it).
  22. Remember….think ‘keyword phrase‘ rather than ‘keyword‘, ‘keyword‘,’keyword‘… think Long Tail.
  23. Google will select the best title it wants for your search snippet – and it will take that information from multiple sources, NOT just your page title element. A small title is often appended with more information about the domain. Sometimes, if Google is confident in the BRAND name, it will replace it with that (often adding it to the beginning of your title with a colon, or sometimes appending the end of your snippet title with the actual domain address the page belongs to).
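The length rules of thumb above can be turned into a quick sanity check. Note the ~55-character and ~12-word figures are this article's guidelines, not official Google limits:

```python
def check_title(title, char_limit=55, word_limit=12):
    """Flag page titles likely to be truncated in a Google snippet.

    char_limit / word_limit are rules of thumb from this article
    (a ~55-character display budget, ~12 words counted as title),
    not official Google numbers.
    """
    words = title.split()
    issues = []
    if len(title) > char_limit:
        issues.append(f"{len(title)} chars - may be truncated in SERPs")
    if len(words) > word_limit:
        issues.append(f"{len(words)} words - tail may be treated as body text")
    return issues

print(check_title("Meta Tags & Title Elements: A Plain-English Guide"))  # []
```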

A Note About Title Tags:

When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) if this is a spam site or a quality site – such as – have you repeated the keyword four times or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.

Always aim to keep your HTML page title elements simple and as unique as possible.

You should certainly be cleaning up the way you write your titles all the time if necessary.


 

 

 

What is The Perfect Keyword Density?


The short answer to this is – there isn’t one. There is no one-size-fits-all keyword density, no optimal percentage guaranteed to rank any page at number 1. However, I do know you can keyword stuff a page and trip a spam filter.

Most web optimisation professionals agree there is no ideal percentage of keywords in a text to get a page to number 1 in Google. Search engines are not that easy to fool, although the key to success in many fields is doing simple things well (or, at least, better than the competition).

I write natural page copy where possible, always focused on the key terms – I never calculate density to identify the best percentage – there are way too many other things to work on. I have looked into this: if it looks natural, it’s OK with me. I aim to include related terms, long-tail variants and synonyms in Primary Content – at least ONCE, as that is all some pages need. Optimal keyword density is a myth, although there are many who would argue otherwise.
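For reference, keyword density is just the percentage of words on the page that match the keyword – trivial to compute, which is perhaps why it became such a fixation. A minimal sketch (the point stands: don't optimise to a target number):

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

print(keyword_density("SEO is not about density. Density is a myth.", "density"))
```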

‘Things, Not Strings’

Google is better at working out what a page is about, and what it should be about to satisfy the intent of a searcher, and it isn’t relying only on keyword phrases on a page to do that anymore.

Google has a Knowledge Graph populated with NAMED ENTITIES and, in certain circumstances, Google relies on such information to create SERPs (Search Engine Results Pages).

Google has plenty of options when rewriting the query in a contextual way, based on what you searched for previously, who you are, how you searched and where you are at the time of the search.

Can I Just Write Naturally and Rank High in Google?

Yes, you must write naturally (and succinctly) in 2016, but if you have no idea of the keywords you are targeting, and no expertise in the topic, you will be left behind those who can access this experience.

You can just ‘write naturally’ and still rank, albeit for fewer keywords than you would have if you optimised the page.

There are too many competing pages targeting the top spots not to optimise your content.

Naturally, how much text you need to write, how much you need to work into it, and where you ultimately rank, is going to depend on the domain reputation of the site you are publishing the article on.

Do You Need Lots of Text To Rank Pages In Google?

User search intent is a way marketers describe what a user wants to accomplish when they perform a Google search.

SEOs have understood user search intent to fall broadly into the following categories and there is an excellent post on Moz about this.

  1. Transactional – The user wants to do something – buy, sign up, register – to complete a task they have in mind.
  2. Informational – The user wishes to learn something.
  3. Navigational – The user knows where they are going.

The Google human quality rater guidelines modify these to simpler constructs:

  • Do 
  • Know
  • Go

As long as you meet the user’s primary intent, you can do this with as few words as it takes to do so.

You do NOT need lots of text to rank in Google.

Optimise For User Intent & Satisfaction

When it comes to writing SEO-friendly text for Google, we must optimise for user intent, not simply what a user typed into Google.

Google will send people looking for information on a topic to the highest quality, relevant pages it has in its database, often BEFORE it relies on how Google ‘used‘ to work e.g. relying on finding near or exact match instances of a keyword phrase on any one page.

Google is constantly evolving to better understand the context and intent of user behaviour, and it doesn’t mind rewriting the query used to serve high-quality pages to users that comprehensively deliver on user satisfaction e.g. explore topics and concepts in a unique and satisfying way.

Of course, optimising for user intent, even in this fashion, is something a lot of marketers had been doing long before query rewriting and Google Hummingbird came along.

Optimising For ‘The Long Click’

When it comes to rating user satisfaction, there are a few theories doing the rounds at the moment that I think are sensible. Google could be tracking user satisfaction by proxy. When a user uses Google to search for something, user behaviour from that point on can be a proxy of the relevance and relative quality of the actual SERP.

What is a Long Click?

A user clicks a result and spends time on it, sometimes terminating the search.

What is a Short Click?

A user clicks a result and bounces back to the SERP, pogo-sticking between other results until a long click is observed. Google has this information if it wants to use it as a proxy for query satisfaction.
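As a toy illustration of the proxy being described – classifying clicks by dwell time. Nobody outside Google knows whether, or how, this is measured; the sketch below is purely illustrative, and the 30-second threshold is an invented assumption:

```python
def classify_click(dwell_seconds, returned_to_serp, threshold=30):
    """Classify a result click as 'long' or 'short' by dwell time.

    A long click: the user stayed on the result, or ended the search.
    A short click: the user bounced back to the SERP quickly.
    The 30s threshold is an illustrative assumption, not a known value.
    """
    if returned_to_serp and dwell_seconds < threshold:
        return "short"
    return "long"

print(classify_click(4, returned_to_serp=True))     # short
print(classify_click(120, returned_to_serp=False))  # long
```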

For more on this, I recommend this article on the time to long click.

Optimise Supplementary Content on the Page

Once you have the content, you need to think about supplementary content and secondary links that help users on their journey of discovery.

That content CAN be links to your own content on other pages, but if you are really helping a user understand a topic – you should be LINKING OUT to other helpful resources, e.g. other websites.

A website that does not link out to ANY other website could accurately be interpreted as, at the least, self-serving. I can’t think of a website that is the true end-point of the web.

  • TASK – On informational pages, LINK OUT to related pages on other sites AND on other pages on your own website where RELEVANT
  • TASK – For e-commerce pages, ADD RELATED PRODUCTS.
  • TASK – Create In-depth Content Pieces
  • TASK – Keep Content Up to Date, Minimise Ads, Maximise Conversion, Monitor for Broken or Redirected Links
  • TASK – Assign in-depth content to an author with some online authority, or someone with displayable expertise on the subject
  • TASK – If running a blog, first, clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold all this together in a single topic-centred page, helping a user to understand something related to what you sell.


 

Do Keywords In Bold Or Italic Help?

Some webmasters claim that putting your keywords in bold or in italics is a beneficial ranking factor in terms of search engine optimisation of a page.

It is essentially impossible to test this, and I think that, these days, Google could well be using this (and other easy-to-identify on-page optimisation efforts) to determine what to punish a site for, not what to promote in SERPs.

Any item you can ‘optimise’ on your page, Google can use against you to filter you out of results.

I use bold or italics these days specifically for users.

I only use emphasis if it’s natural or this is really what I want to emphasise!

Do not tell Google what to filter you for that easily.

I think Google treats websites it trusts far differently from untrusted sites in some respects.

Keep it simple, natural, useful and random.

How Many Words & Keywords Should be Used On A Page?

I get asked this all the time –

how much text do you put on a page to rank for a certain keyword?

The answer is that there is no optimal amount of text per page; how much text you’ll ‘need’ will be based on your DOMAIN AUTHORITY, your TOPICAL RELEVANCE, how much COMPETITION there is for that term, and HOW COMPETITIVE that competition is. Instead of thinking about the quantity of the text, you should think more about the quality of the content on the page. Optimise this with searcher intent in mind. Well, that’s how I do it.

I don’t find that you need a minimum number of words or amount of text to rank in Google. I have seen pages with 50 words outrank pages with 100, 250, 500 or 1,000 words. Then again, I have seen pages with no text rank on nothing but inbound links or another ‘strategy’. In 2016, Google is a lot better at hiding away those pages, though.

At the moment, I prefer long-form pages with a lot of text, although I still rely heavily on keyword analysis to make my pages. The benefit of longer pages is that they are great for long-tail key phrases. Creating deep, information-rich pages focuses the mind when it comes to producing authoritative, useful content.

Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain it is hosted on. For me, the important thing is to make a page relevant to a user’s search query. I don’t care how many words I achieve this with and often I need to experiment on a site I am unfamiliar with. After a while, you get an idea how much text you need to use to get a page on a certain domain into Google.

One thing to note – the more text you add to the page, as long as it is unique, keyword rich and relevant, the more that page will be rewarded with more visitors from Google. There is no optimal number of words on a page for placement in Google. Every website – every page – is different from what I can see. Don’t worry too much about word count if your content is original and informative. Google will probably reward you on some level – at some point – if there is lots of unique text on all your pages.


What is Keyword Research?

The first step in any professional campaign is to do some keyword research and analysis.

[Screenshot: Google ranking chart, 19 July 2013]

Somebody asked me about this simple white hat tactic – I think it is probably the simplest thing anyone can do that guarantees results.

The chart above (from last year) illustrates a reasonably valuable 4-word term that I noticed a page of mine didn’t rank highly in Google for, but that I thought it probably should – and could – rank for, with this simple technique.

I thought it would serve as a simple example to illustrate an aspect of on-page SEO, or ‘rank modification’, that’s white hat, 100% Google friendly and never, ever going to cause you a problem with Google. This ‘trick’ works with any keyword phrase, on any site, with obviously differing results based on the availability of competing pages in SERPs, and the availability of content on your site.

The keyword phrase I am testing rankings for isn’t ON the page, and I did NOT add the key phrase to the page, to incoming links, or via any technical tricks like redirects or any hidden technique – but as you can see from the chart, rankings seem to be going in the right direction.

You can profit from it if you know a little about how Google works (or seems to work, in many observations, over years – excluding when Google throws you a bone on synonyms). You can’t ever be 100% certain you know how Google works on any level, unless it’s data showing you’re wrong, of course.

What did I do to rank number 1 from nowhere for that key phrase?

I added one keyword to the page in plain text because adding the actual ‘keyword phrase’ itself would have made my text read a bit keyword stuffed for other variations of the main term. It gets interesting if you do that to a lot of pages, and a lot of keyword phrases. The important thing is keyword research – and knowing which unique keywords to add.

This example illustrates that a key to ‘relevance’ on a page, in a lot of instances, is a keyword.

The precise keyword.

Yes – plenty of other things can be happening at the same time. It’s hard to identify EXACTLY why Google ranks pages all the time… but you can COUNT on other things happening, and just get on with what you can see works for you.

In a time of light optimisation, it’s useful to EARN a few terms you SHOULD rank for in simple ways that leave others wondering how you got it.

Of course, you can still keyword stuff a page, or still spam your link profile – but it is ‘light’ optimisation I am genuinely interested in testing on this site – how to get more with less – I think that’s the key to not tripping Google’s aggressive algorithms.

There are many tools on the web to help with basic keyword research (including the Google Keyword Planner tool), and there are even more useful third-party SEO tools to help you do this.

You can use many keyword research tools to quickly identify opportunities to get more traffic to a page.
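For illustration only, here is a minimal sketch of the kind of opportunity-finding these tools automate. The CSV format, column names, figures and thresholds below are my own invention, not any particular tool’s output: the idea is simply that queries already ranking in positions 5–20 with decent impressions are usually the quickest wins.

```python
import csv
from io import StringIO

# Hypothetical query export: query, page, clicks, impressions, average position
SAMPLE = """query,page,clicks,impressions,position
blue widgets,/widgets,12,400,3.1
cheap widgets,/widgets,2,900,14.5
widget repair,/repair,1,650,18.2
widgets,/,300,5000,1.2
"""

def keyword_opportunities(rows, min_impressions=100, min_pos=5, max_pos=20):
    """Return queries that already rank on page one or two but get few clicks.
    Small on-page tweaks to these pages often yield the quickest gains."""
    out = []
    for r in rows:
        pos = float(r["position"])
        if int(r["impressions"]) >= min_impressions and min_pos <= pos <= max_pos:
            out.append((r["query"], r["page"], pos))
    # Best-placed opportunities first
    return sorted(out, key=lambda t: t[2])

rows = csv.DictReader(StringIO(SAMPLE))
for query, page, pos in keyword_opportunities(rows):
    print(f"{query!r} on {page} at position {pos}")
```

With the sample data above, only ‘cheap widgets’ and ‘widget repair’ qualify – the other two queries already rank in the top five.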

I built my own:

Google Analytics Keyword ‘Not Provided’

Google Analytics was the very best place to look at keyword opportunity for some (especially older) sites, but that all changed a few years back.

Google stopped telling us which keywords were sending traffic to our sites from the search engine back in October 2011, as part of privacy concerns for its users.

Google will now begin encrypting searches that people do by default, if they are logged into Google.com already through a secure connection. The change to SSL search also means that sites people visit after clicking on results at Google will no longer receive “referrer” data that reveals what those people searched for, except in the case of ads.

Google Analytics now displays keyword “not provided” instead.

In Google’s new system, referrer data will be blocked. This means site owners will begin to lose valuable data that they depend on, to understand how their sites are found through Google. They’ll still be able to tell that someone came from a Google search. They won’t, however, know what that search was. – SearchEngineLand (a great source for Google industry news)

You can still get some of this data if you sign up for Google Webmaster Tools (and you can combine it with Google Analytics), but the data even there is limited and often not entirely accurate. The keyword data can be useful, though – and access to backlink data is essential these days.
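To put a number on how much keyword data is now hidden, you can sum up your Analytics organic keyword report. This is a rough sketch with made-up figures – the report shape and numbers are illustrative, not real data:

```python
# Hypothetical Google Analytics organic keyword report (made-up figures):
# on most aged sites, "(not provided)" now dominates the list.
data = [
    ("(not provided)", 8450),
    ("hobo web seo", 120),
    ("seo tutorial", 95),
    ("page title length", 60),
]

def not_provided_share(rows):
    """Fraction of organic sessions whose keyword Google now withholds."""
    total = sum(sessions for _, sessions in rows)
    hidden = sum(sessions for kw, sessions in rows if kw == "(not provided)")
    return hidden / total if total else 0.0

print(f"{not_provided_share(data):.1%} of sessions hide the keyword")  # 96.8%
```

On an aged site, a figure like this makes the historical keyword data (from before the switch) all the more valuable.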

If the website you are working on is an aged site – there’s probably a wealth of keyword data in Google Analytics:

Chuck Reynolds
Contributor

 

What about SEO Ranking Factors?

What about SEO Ranking Factors?

Google has HUNDREDS of ranking factors, with signals that can change daily, weekly, monthly or yearly, to help it work out where your page ranks in comparison to other competing pages in SERPs.

You will not ever find every ranking factor. Many ranking factors are on-page or on-site and others are off-page or off-site. Some ranking factors are based on where you are, or what you have searched for before. I’ve been in online marketing for 15 years. In that time, a lot has changed. I’ve learned to focus on aspects that offer the greatest return on investment of your labour.

Learn SEO Basics….

Here are a few simple SEO tips to begin with:

  • If you are just starting out, don’t think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So, it’s best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you are just starting out – you may as well learn how to do it within Google’s Webmaster Guidelines first. Make a decision, early, if you are going to follow Google’s guidelines, or not, and stick to it. Don’t be caught in the middle with an important project. Do not always follow the herd.
  • If your aim is to deceive visitors from Google, in any way, Google is not your friend. Google is hardly your friend at any rate – but you don’t want it as your enemy. Google will send you lots of free traffic though if you manage to get to the top of search results, so perhaps they are not all that bad.
  • A lot of optimisation techniques that are effective in boosting a site’s rankings in Google are against Google’s guidelines. For example, many links that may have once promoted you to the top of Google may, in fact, today be hurting your site and its ability to rank high in Google. Keyword stuffing might be holding your page back. You must be smart, and cautious, when it comes to building links to your site in a manner that Google *hopefully* won’t have too much trouble with in the FUTURE – because they will punish you in the future.
  • Don’t expect to rank number 1 in any niche for a competitive keyword without a lot of investment and work. Don’t expect results overnight. Expecting too much too fast might get you in trouble with the spam team.
  • You don’t pay anything to get into Google, Yahoo or Bing natural, or free, listings. It’s common for the major search engines to find your website pretty quickly by themselves within a few days. This is made so much easier if your CMS actually ‘pings’ search engines when you update content (via XML sitemaps or RSS, for instance).
  • To be listed and rank high in Google and other search engines, you really should consider and mostly abide by search engine rules and official guidelines for inclusion. With experience and a lot of observation, you can learn which rules can be bent, and which tactics are short term and perhaps, should be avoided.
  • Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from a page to another page is viewed in Google’s “eyes” as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it – in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.
  • I’ve always thought if you are serious about ranking – do so with ORIGINAL COPY. It’s clear – search engines reward good content it hasn’t found before. It indexes it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So – make sure each of your pages has enough text content you have written specifically for that page – and you won’t need to jump through hoops to get it ranking.
  • If you have original, quality content on a site, you also have a chance of generating inbound quality links (IBL). If your content is found on other websites, you will find it hard to get links, and it probably will not rank very well as Google favours diversity in its results. If you have original content of sufficient quality on your site, you can then let authority websites – those with online business authority – know about it, and they might link to you – this is called a quality backlink.
  • Search engines need to understand that ‘a link is a link’ that can be trusted. Links can be designed to be ignored by search engines with the rel=”nofollow” attribute.
  • Search engines can also find your site by other websites linking to it. You can also submit your site to search engines direct, but I haven’t submitted any site to a search engine in the last ten years – you probably don’t need to do that. If you have a new site, I would immediately register it with Google Webmaster Tools these days.
  • Google and Bing use a crawler (Googlebot and Bingbot) that spiders the web looking for new links. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages outwith an XML sitemap.
  • Many think Google will not allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while then disappears for months. A “honeymoon period” to give you a taste of Google traffic, no doubt.
  • Google WILL classify your site when it crawls and indexes your site – and this classification can have a DRASTIC effect on your rankings – it’s important for Google to work out WHAT YOUR ULTIMATE INTENT IS – do you want to be classified as an affiliate site made ‘just for Google’, a domain holding page or a small business website with a real purpose? Ensure you don’t confuse Google by being explicit with all the signals you can – to show on your website you are a real business, and your INTENT is genuine – and even more importantly today – FOCUSED ON SATISFYING A VISITOR.
  • NOTE – If a page exists only to make money from Google’s free traffic – Google calls this spam. I go into this more, later in this guide.
  • The transparency you provide on your website in text and links about who you are, what you do, and how you’re rated on the web or as a business is one way that Google could use (algorithmically and manually) to ‘rate’ your website. Note that Google has a HUGE army of quality raters and at some point they will be on your site if you get a lot of traffic from Google.
  • To rank for specific keyword phrase searches, you usually need to have the keyword phrase or highly relevant words on your page (not necessarily all together, but it helps) or in links pointing to your page/site.
  • Ultimately what you need to do to compete is largely dependent on what the competition for the term you are targeting is doing. You’ll need to at least mirror how hard they are competing if a better opportunity is hard to spot.
  • As a result of other quality sites linking to your site, the site now has a certain amount of real PageRank that is shared with all the internal pages that make up your website, and that will help provide a signal as to where those pages rank in the future.
  • Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions they take a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action on your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Webmaster Tools if you sign up.
  • Link building is not JUST a numbers game, though. One link from a “trusted authority” site in Google could be all you need to rank high in your niche. Of course, the more “trusted” links you attract, the more Google will trust your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to get the most from Google in 2016.
  • Try and get links within page text pointing to your site with relevant, or at least natural looking, keywords in the text link – not, for instance, in blogrolls or site-wide links. Try to ensure the links are not obviously “machine generated”, e.g. site-wide links on forums or directories. Get links from pages that, in turn, have a lot of links to them, and you will soon see benefits.
  • Onsite, consider linking to your other pages by linking to pages within the main content text. I usually only do this when it is relevant – often, I’ll link to relevant pages when the keyword is in the title elements of both pages. I don’t go in for auto-generating links at all. Google has penalised sites for using particular auto link plugins, for instance, so I avoid them.
  • Linking to a page with actual key-phrases in the link help a great deal in all search engines when you want to feature for specific key terms. For example; “SEO Scotland” as opposed to http://www.hobo-web.co.uk or “click here“. Saying that – in 2016, Google is punishing manipulative anchor text very aggressively, so be sensible – and stick to brand mentions and plain URL links that build authority with less risk. I rarely ever optimise for grammatically incorrect terms these days (especially with links).
  • I think the anchor text links in internal navigation is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever internal link keyword-rich architecture and be sure to understand for instance how many words Google counts in a link, but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
  • Search engines like Google ‘spider’ or ‘crawl’ your entire site by following all the links on your site to new pages, much as a human would click on the links to your pages. Google will crawl and index your pages, and within a few days usually, begin to return your pages in SERPs.
  • After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.
  • Ideally, you will have unique pages, with unique page titles and unique page meta descriptions. Google does not seem to use the meta description when ranking your page for specific keyword searches if it is not relevant, and unless you are careful, you might end up just giving spammers free original text for their site – and not yours – once they scrape your descriptions and put the text in the main content on their site. I don’t worry about meta keywords these days, as Google and Bing say they either ignore them or use them as spam signals.
  • Google will take some time to analyse your entire site, examining text content and links. This process is taking longer and longer these days but is ultimately determined by your domain reputation and real PageRank.
  • If you have a lot of duplicate low-quality text already found by Googlebot on other websites it knows about, Google will ignore your page. If your site or page has spammy signals, Google will penalise it, sooner or later. If you have lots of these pages on your site – Google will ignore most of your website.
  • You don’t need to keyword stuff your text to beat the competition.
  • You optimise a page for more traffic by increasing the frequency of the desired key phrase, related key terms, co-occurring keywords and synonyms in links, page titles and text content. There is no ideal amount of text – no magic keyword density. Keyword stuffing is a tricky business, too, these days.
  • I prefer to make sure I have as many UNIQUE relevant words on the page that make up as many relevant long tail queries as possible.
  • If you link out to irrelevant sites, Google may ignore the page, too – but again, it depends on the site in question. Who you link to, or HOW you link to, REALLY DOES MATTER – I expect Google to use your linking practices as a potential means by which to classify your site. Affiliate sites, for example, don’t do well in Google these days without some good quality backlinks and higher quality pages.
  • Many search engine marketers think who you link out to (and who links to you) helps determine a topical community of sites in any field or a hub of authority. Quite simply, you want to be in that hub, at the centre if possible (however unlikely), but at least in it. I like to think of this one as a good thing to remember in the future as search engines get even better at determining topical relevancy of pages, but I have never actually seen any granular ranking benefit (for the page in question) from linking out.
  • I’ve got by, by thinking external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your Google Juice once it’s been “soaked up” by the higher pages in your site structure (the home page, your category pages). This tactic is old school, but I still follow it. I don’t think you need to worry about that too much in 2016.
  • Original content is king and will attract a “natural link growth” – in Google’s opinion. Too many incoming links too fast might devalue your site – but again, it depends. I usually err on the safe side – I always aimed for massive diversity in my links – to make them look ‘more natural’. Honestly, I go for natural links in 2016, full stop, for this website.
  • Google can devalue whole sites, individual pages, template generated links and individual links if Google deems them “unnecessary” and a ‘poor user experience’.
  • Google knows who links to you, the “quality” of those links, and whom you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority – sometimes it seems that the most relevant page on your site Google HAS NO ISSUE with will rank.
  • Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages and ensuring at least one page is well optimised amongst the rest of your pages for your desired key phrase. Always remember Google does not want to rank ‘thin’ pages in results – any page you want to rank – should have all the things Google is looking for. That’s a lot these days!
  • It is important you spread all that real ‘PageRank’ – or link equity – to your sales keyword / phrase rich sales pages, and as much remains to the rest of the site pages, so Google does not ‘demote’ pages into oblivion –  or ‘supplemental results’ as we old timers knew them back in the day. Again – this is slightly old school – but it gets me by, even today.
  • Consider linking to important pages on your site from your home page, and other important pages on your site.
  • Focus on RELEVANCE first. Then, focus your marketing efforts and get REPUTABLE. This is the key to ranking ‘legitimately’ in Google in 2016.
  • Every few months Google changes its algorithm to punish sloppy optimisation or industrial manipulation. Google Panda and Google Penguin are two such updates, but the important thing is to understand Google changes its algorithms constantly to control its listings pages (over 600 changes a year we are told).
  • The art of rank modification is to rank without tripping these algorithms or getting flagged by a human reviewer – and that is tricky!
  • Focus on improving website download speeds at all times. The web is changing very fast, and a fast website is a good user experience.
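On the keyword-density points above: there is no magic number, but a trivial script shows the kind of sanity check I mean – is the phrase present a few times, and is there a healthy spread of unique words on the page? (The function, sample text and output below are my own illustration, not a Google formula.)

```python
import re

def term_stats(text, phrase):
    """Rough on-page numbers: phrase frequency, total and unique words.
    There is no magic density -- this only flags obvious keyword stuffing."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_count = text.lower().count(phrase.lower())
    return {
        "phrase_count": phrase_count,
        "word_count": len(words),
        "unique_words": len(set(words)),
        # Share of all words taken up by the target phrase
        "density": (phrase_count * len(phrase.split()) / len(words)) if words else 0.0,
    }

page = ("Our widget repair guide covers common widget faults, "
        "repair costs and how to find a local widget repair shop.")
print(term_stats(page, "widget repair"))
```

If the density figure jumps out at you when you read it aloud, it will probably jump out at Google’s algorithms too.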

Welcome to the tightrope that is modern web optimisation.

Chuck Reynolds
Contributor

Important Google Ranking Factor?

Is Domain Age An Important Google Ranking Factor?

No, not in isolation.

Having a ten-year-old domain that Google knows nothing about is the same as having a brand new domain. A ten-year-old site that’s continually cited, year on year, by the actions of other, more authoritative and trusted sites? That’s valuable. But that’s not the age of your website address ON ITS OWN in play as a ranking factor.

A one-year-old domain cited by authority sites is just as valuable, if not more valuable, than a ten-year-old domain with no links and no search-performance history. Perhaps domain age may come into play when other factors are considered – but I think Google works very much like this on all levels, with all ‘ranking factors’ and all ranking ‘conditions’.

I don’t think you can consider discovering ‘ranking factors’ without ‘ranking conditions’.

Other Ranking Factors:

  1. Domain age; (NOT ON ITS OWN)
  2. Length of site domain registration; (I don’t see much benefit ON ITS OWN, even knowing “Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year.” Paying for a domain in advance just tells others you don’t want anyone else using the domain name; it is not much of an indication that you’re going to do something Google cares about.)
  3. Domain registration information was hidden/anonymous; (possibly, under human review if OTHER CONDITIONS are met like looking like a spam site)
  4. Site top level domain (geographical focus, e.g. com versus co.uk); (YES)
  5. Site top level domain (e.g. .com versus .info); (DEPENDS)
  6. Sub domain or root domain? (DEPENDS)
  7. Domain past records (how often it changed IP); (DEPENDS)
  8. Domain past owners (how often the owner was changed) (DEPENDS)
  9. Keywords in the domain; (DEFINITELY – ESPECIALLY AN EXACT KEYWORD MATCH – although Google has a lot of filters that mute the performance of an exact match domain in 2016)
  10. Domain IP; (DEPENDS – for most, no)
  11. Domain IP neighbours; (DEPENDS – for most, no)
  12. Domain external mentions (non-linked) (I have no idea in 2016)
  13. Geo-targeting settings in Google Webmaster Tools (YES – of course)

Google Penalties For Unnatural Footprints

In 2016, you need to be aware that what works to improve your rank can also get you penalised (faster, and a lot more noticeably).

In particular, the Google web spam team is currently waging a PR war on sites that rely on unnatural links and other ‘manipulative’ tactics (and handing out severe penalties if it detects them). And that’s on top of many algorithms already designed to look for other manipulative tactics (like keyword stuffing or boilerplate spun text across pages).

Google is making sure it takes longer to see results from black and white hat SEO, and is intent on ensuring a flux in its SERPs based largely on where the searcher is in the world at the time of the search, and where the business is located near to that searcher. There are some things you cannot directly influence legitimately to improve your rankings, but there is plenty you CAN do to drive more Google traffic to a web page.

Chuck Reynolds
Contributor

Ecosystem for all Entrepreneurs