Web design encompasses many different skills and disciplines in the production and maintenance of websites. The different areas of web design include web graphic design; user interface design (UI design); authoring, including standardised code and proprietary software; user experience design (UX design); and search engine optimization. Often many individuals will work in teams covering different aspects of the design process, although some designers will cover them all.[1] The term "web design" is normally used to describe the design process relating to the front-end (client side) design of a website including writing markup. Web design partially overlaps web engineering in the broader scope of web development. Web designers are expected to have an awareness of usability and be up to date with web accessibility guidelines.

History

[Image: Web design books in a store]

1988–2001


Although web design has a fairly recent history, it can be linked to other areas such as graphic design, user experience, and multimedia arts, though it is more aptly seen from a technological standpoint. It has become a large part of people's everyday lives: it is hard to imagine the Internet without animated graphics, different styles of typography, backgrounds, videos, and music. The web was announced on August 6, 1991; in November 1992, CERN's site was the first website to go live on the World Wide Web. During this period, websites were structured using the <table> tag, which designers repurposed from its tabular origins to control page layout. Eventually, web designers found ways to work around its limits and create richer structures and formats. In this early period the structure of websites was fragile and hard to maintain, which made them very difficult to use. In November 1993, ALIWEB (Archie Like Indexing for the WEB) became the first search engine to be created.[2]

The start of the web and web design


In 1989, whilst working at CERN in Switzerland, British scientist Tim Berners-Lee proposed to create a global hypertext project, which later became known as the World Wide Web. From 1991 to 1993 the World Wide Web was born. Text-only HTML pages could be viewed using a simple line-mode web browser.[3] In 1993 Marc Andreessen and Eric Bina created the Mosaic browser. At the time there were multiple browsers; however, the majority of them were Unix-based and naturally text-heavy. There had been no integrated approach to graphic design elements such as images or sounds. The Mosaic browser broke this mould.[4] The W3C was created in October 1994 to "lead the World Wide Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability."[5] This discouraged any one company from monopolizing a proprietary browser and programming language, which could have altered the effect of the World Wide Web as a whole. The W3C continues to set standards, which can today be seen with JavaScript and other languages. In 1994 Andreessen formed Mosaic Communications Corp., which later became known as Netscape Communications, maker of the Netscape 0.9 browser. Netscape created its own HTML tags without regard to the traditional standards process; for example, Netscape 1.1 included tags for changing background colours and formatting text with tables on web pages. From 1996 to 1999 the browser wars began, as Microsoft and Netscape fought for ultimate browser dominance. During this time there were many new technologies in the field, notably Cascading Style Sheets, JavaScript, and Dynamic HTML. On the whole, the browser competition did lead to many positive creations and helped web design evolve at a rapid pace.[6]

Evolution of web design


In 1996, Microsoft released its first competitive browser, complete with its own features and HTML tags. It was also the first browser to support style sheets, which at the time was seen as an obscure authoring technique but is today an important aspect of web design.[6] The HTML markup for tables was originally intended for displaying tabular data. However, designers quickly realized the potential of using HTML tables for creating complex, multi-column layouts that were otherwise not possible. At this time, as design and good aesthetics seemed to take precedence over good markup structure, little attention was paid to semantics and web accessibility. HTML sites were limited in their design options, even more so with earlier versions of HTML. To create complex designs, many web designers had to use complicated table structures or even blank spacer .GIF images to stop empty table cells from collapsing.[7] CSS was introduced in December 1996 by the W3C to support presentation and layout. This allowed HTML code to be semantic rather than both semantic and presentational, and improved web accessibility; see tableless web design.
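The transition described above can be sketched in a minimal before-and-after comparison; the class names, file name, and pixel widths here are illustrative, not drawn from any particular site.

```html
<!-- Table-era layout: a <table> misused for a two-column page,
     with a blank spacer.gif holding an empty cell open. -->
<table width="600">
  <tr>
    <td width="150"><img src="spacer.gif" width="150" height="1">Navigation</td>
    <td>Main content</td>
  </tr>
</table>

<!-- Tableless equivalent after CSS: the markup stays semantic
     and the presentation moves into the style sheet. -->
<style>
  .nav  { float: left; width: 150px; }
  .main { margin-left: 160px; }
</style>
<div class="nav">Navigation</div>
<div class="main">Main content</div>
```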

In 1996, Flash (originally known as FutureSplash) was developed. At the time, the Flash content development tool was relatively simple compared to now, using basic layout and drawing tools, a limited precursor to ActionScript, and a timeline, but it enabled web designers to go beyond the point of HTML, animated GIFs and JavaScript. However, because Flash required a plug-in, many web developers avoided using it for fear of limiting their market share due to lack of compatibility. Instead, designers reverted to GIF animations (if they did not forego using motion graphics altogether) and JavaScript for widgets. But the benefits of Flash made it popular enough among specific target markets to eventually work its way to the vast majority of browsers, and powerful enough to be used to develop entire sites.[7]

End of the first browser wars


In 1998, Netscape released Netscape Communicator code under an open-source licence, enabling thousands of developers to participate in improving the software. However, these developers decided to start a standard for the web from scratch, which guided the development of the open-source browser and soon expanded to a complete application platform.[6] The Web Standards Project was formed and promoted browser compliance with HTML and CSS standards. Programs like Acid1, Acid2, and Acid3 were created in order to test browsers for compliance with web standards. In 2000, Internet Explorer was released for Mac, which was the first browser that fully supported HTML 4.01 and CSS 1. It was also the first browser to fully support the PNG image format.[6] By 2001, after a campaign by Microsoft to popularize Internet Explorer, Internet Explorer had reached 96% of web browser usage share, which signified the end of the first browser wars as Internet Explorer had no real competition.[8]

2001–2012


Since the start of the 21st century, the web has become more and more integrated into people's lives. As this has happened the technology of the web has also moved on. There have also been significant changes in the way people use and access the web, and this has changed how sites are designed.

Since the end of the browser wars, new browsers have been released. Many of these are open source, meaning that they tend to have faster development and are more supportive of new standards. The new options are considered by many to be better than Microsoft's Internet Explorer.

The W3C has released new standards for HTML (HTML5) and CSS (CSS3), as well as new JavaScript APIs, each as a new but individual standard. While the term HTML5 is only used to refer to the new version of HTML and some of the JavaScript APIs, it has become common to use it to refer to the entire suite of new standards (HTML5, CSS3 and JavaScript).

2012 and later


With the advancements in 3G and LTE internet coverage, a significant portion of website traffic shifted to mobile devices. This shift influenced the web design industry, steering it towards a minimalist, lighter, and more simplistic style. The "mobile first" approach emerged as a result, emphasizing the creation of website designs that prioritize mobile-oriented layouts first, before adapting them to larger screen dimensions.

Tools and technologies


Web designers use a variety of different tools depending on what part of the production process they are involved in. These tools are updated over time by newer standards and software but the principles behind them remain the same. Web designers use both vector and raster graphics editors to create web-formatted imagery or design prototypes. A website can be created using WYSIWYG website builder software or a content management system, or the individual web pages can be hand-coded in just the same manner as the first web pages were created. Other tools web designers might use include markup validators[9] and other testing tools for usability and accessibility to ensure their websites meet web accessibility guidelines.[10]

UX Design


One popular discipline in web design is UX design, the practice of designing products around an accurate understanding of the user's background and needs. UX design is a broad field: it is not limited to the web, and its fundamentals can be applied to many other platforms and apps. Web design, by contrast, is tied to web-based products; UX overlaps with both web design and product design more generally, and often focuses on products that are not web-based.[11]

Skills and techniques


Marketing and communication design


Marketing and communication design on a website may identify what works for its target market. This can be an age group or particular strand of culture; thus the designer may understand the trends of its audience. Designers may also understand the type of website they are designing, meaning, for example, that business-to-business (B2B) website design considerations might differ greatly from those for a consumer-targeted website such as a retail or entertainment site. Careful consideration might be made to ensure that the aesthetics or overall design of a site do not clash with the clarity and accuracy of the content or the ease of web navigation,[12] especially on a B2B website. Designers may also consider the reputation of the owner or business the site is representing to make sure they are portrayed favorably. Web designers normally oversee how the websites they build work and operate, continually updating and adjusting them behind the scenes; the elements they handle include text, photos, graphics, and page layout. Before beginning work on a website, web designers normally meet with their clients to discuss layout, colour, graphics, and design. Web designers spend the majority of their time designing websites and making sure their speed is acceptable, and they typically also engage in testing, marketing, and communicating with other designers about laying out websites and finding the right elements for them.[13]

User experience design and interactive design


User understanding of the content of a website often depends on user understanding of how the website works. This is part of the user experience design. User experience is related to layout, clear instructions, and labeling on a website. How well a user understands how they can interact on a site may also depend on the interactive design of the site. If a user perceives the usefulness of the website, they are more likely to continue using it. Users who are skilled and well versed in website use may find a more distinctive, yet less intuitive or less user-friendly website interface useful nonetheless. However, users with less experience are less likely to see the advantages or usefulness of a less intuitive website interface. This drives the trend for a more universal user experience and ease of access to accommodate as many users as possible regardless of user skill.[14] Much of the user experience design and interactive design are considered in the user interface design.

Advanced interactive functions may require plug-ins if not advanced coding language skills. Choosing whether or not to use interactivity that requires plug-ins is a critical decision in user experience design. If the plug-in doesn't come pre-installed with most browsers, there's a risk that the user will have neither the know-how nor the patience to install a plug-in just to access the content. If the function requires advanced coding language skills, it may be too costly in either time or money to code compared to the amount of enhancement the function will add to the user experience. There's also a risk that advanced interactivity may be incompatible with older browsers or hardware configurations. Publishing a function that doesn't work reliably is potentially worse for the user experience than making no attempt. Whether the risk is worth taking depends on the target audience and how likely the function is to be needed.

Progressive enhancement

[Image: The order of progressive enhancement]

Progressive enhancement is a strategy in web design that puts emphasis on web content first, allowing everyone to access the basic content and functionality of a web page, whilst users with additional browser features or faster Internet access receive the enhanced version instead.

In practice, this means serving content through HTML and applying styling and animation through CSS to the technically possible extent, then applying further enhancements through JavaScript. Pages' text is loaded immediately through the HTML source code rather than having to wait for JavaScript to initiate and load the content subsequently, which allows content to be readable with minimum loading time and bandwidth, and through text-based browsers, and maximizes backwards compatibility.[15]

As an example, MediaWiki-based sites including Wikipedia use progressive enhancement, as they remain usable while JavaScript and even CSS is deactivated, as pages' content is included in the page's HTML source code, whereas counter-example Everipedia relies on JavaScript to load pages' content subsequently; a blank page appears with JavaScript deactivated.
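The layering described above can be sketched in a minimal page; the file name, element ID, and class name are invented for illustration.

```html
<!-- 1. Content first: the article text is in the HTML source itself,
     readable in any browser, including text-based ones. -->
<article id="story">
  <h1>Headline</h1>
  <p>The full article text is present without any script running.</p>
</article>

<!-- 2. Presentation layered on via CSS; the content survives
     even if the style sheet never loads. -->
<link rel="stylesheet" href="style.css">

<!-- 3. Behaviour layered on last via JavaScript; if it never runs,
     nothing above is lost. -->
<script>
  document.getElementById("story").classList.add("enhanced");
</script>
```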

Page layout


Part of the user interface design is affected by the quality of the page layout. For example, a designer may consider whether the site's page layout should remain consistent on different pages when designing the layout. Page pixel width may also be considered vital for aligning objects in the layout design. The most popular fixed-width websites generally have the same set width to match the current most popular browser window, at the current most popular screen resolution, on the current most popular monitor size. Most pages are also center-aligned for concerns of aesthetics on larger screens.
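A fixed-width, center-aligned layout of the kind described is conventionally achieved with an auto-margin wrapper; the 960px figure below is a once-common choice used for illustration, not a prescription.

```css
/* A fixed-width page wrapper, centered on larger screens. */
.page {
  width: 960px;      /* matched to a then-popular screen resolution */
  margin: 0 auto;    /* equal left/right margins center the layout */
}
```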

Fluid layouts increased in popularity around 2000 to allow the browser to make user-specific layout adjustments based on the details of the reader's screen (window size, font size relative to window, etc.). They grew as an alternative to HTML-table-based layouts and grid-based design in both page layout design principles and coding technique, but were very slow to be adopted.[note 1] This was due to considerations of screen-reading devices and varying window sizes, which designers have no control over. Accordingly, a design may be broken down into units (sidebars, content blocks, embedded advertising areas, navigation areas) that are sent to the browser and fitted into the display window by the browser as best it can. Although such a display may often change the relative position of major content units, sidebars, for example, may be displaced below body text rather than beside it. This is a more flexible display than a hard-coded grid-based layout that doesn't fit the device window. In particular, the relative position of content blocks may change while leaving the content within the block unaffected. This also minimizes the user's need to horizontally scroll the page.

Responsive web design is a newer approach, based on CSS3, and a deeper level of per-device specification within the page's style sheet through an enhanced use of the CSS @media rule. In March 2018 Google announced they would be rolling out mobile-first indexing.[16] Sites using responsive design are well placed to ensure they meet this new approach.
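The CSS @media rule mentioned above lets a single style sheet adapt per device; a minimal mobile-first sketch (the class names and the 768px breakpoint are illustrative):

```css
/* Fluid, mobile-first base: blocks stack in a single column. */
.sidebar, .content { width: 100%; }

/* Enhanced multi-column layout once the viewport is wide enough. */
@media (min-width: 768px) {
  .sidebar { float: left; width: 25%; }
  .content { float: left; width: 75%; }
}
```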

Typography


Web designers may choose to limit the variety of website typefaces to only a few which are of a similar style, instead of using a wide range of typefaces or type styles. Most browsers recognize a specific number of safe fonts, which designers mainly use in order to avoid complications.

Font downloading was later included in the CSS3 fonts module and has since been implemented in Safari 3.1, Opera 10, and Mozilla Firefox 3.5. This has subsequently increased interest in web typography, as well as the usage of font downloading.
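Both practices can be seen in a short style-sheet sketch: a downloaded font declared with the CSS3 @font-face rule, falling back through a stack of widely supported "safe" fonts. The font name and file path are invented for illustration.

```css
/* A downloadable font, per the CSS3 fonts module. */
@font-face {
  font-family: "ExampleSerif";           /* hypothetical font name */
  src: url("/fonts/example-serif.woff"); /* hypothetical file path */
}

/* Fall back to safe fonts if the download fails or is unsupported. */
body {
  font-family: "ExampleSerif", Georgia, "Times New Roman", serif;
}
```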

Most site layouts incorporate negative space to break the text up into paragraphs and also avoid center-aligned text.[17]

Motion graphics


The page layout and user interface may also be affected by the use of motion graphics. The choice of whether or not to use motion graphics may depend on the target market for the website. Motion graphics may be expected, or at least better received, on an entertainment-oriented website. However, a website's target audience with a more serious or formal interest (such as business, community, or government) might find animations unnecessary and distracting if they serve only entertainment or decoration purposes. This doesn't mean that more serious content couldn't be enhanced with animated or video presentations that are relevant to the content. In either case, motion graphic design may make the difference between effective visuals and distracting ones.

Motion graphics that are not initiated by the site visitor can produce accessibility issues. The World Wide Web Consortium's accessibility standards require that site visitors be able to disable the animations.[18]
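One common way to respect such a preference in CSS is the prefers-reduced-motion media feature; a minimal sketch (the .banner class is illustrative):

```css
/* Disable non-essential animation for visitors who have asked
   their system to reduce motion. */
@media (prefers-reduced-motion: reduce) {
  .banner {
    animation: none;
    transition: none;
  }
}
```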

Quality of code


Website designers may consider it to be good practice to conform to standards. This is usually done via a description specifying what the element is doing. Failure to conform to standards may not make a website unusable or error-prone, but standards can relate to the correct layout of pages for readability as well as making sure coded elements are closed appropriately. This includes errors in code, a more organized layout for code, and making sure IDs and classes are identified properly. Poorly coded pages are sometimes colloquially called tag soup. Validating via W3C[9] can only be done when a correct DOCTYPE declaration is made, which is used to highlight errors in code. The system identifies the errors and areas that do not conform to web design standards. This information can then be corrected by the user.[19]
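A minimal standards-conforming page looks like the sketch below, with the DOCTYPE declaration the validator requires and every element closed properly:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Example page</title>
  </head>
  <body>
    <p>Every element is closed, and IDs and classes are named consistently.</p>
  </body>
</html>
```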

Generated content


There are two ways websites are generated: statically or dynamically.

Static websites


A static website stores a unique file for every page it serves. Each time a page is requested, the same content is returned. This content is created once, during the design of the website. It is usually manually authored, although some sites use an automated creation process, similar to a dynamic website, whose results are stored long-term as completed pages. These automatically created static sites became more popular around 2015, with generators such as Jekyll and Adobe Muse.[20]

The benefits of a static website are that it is simpler to host, as its server only needs to serve static content, not execute server-side scripts. This requires less server administration and has less chance of exposing security holes. It can also serve pages more quickly, on low-cost server hardware. These advantages became less important as cheap web hosting expanded to also offer dynamic features, and virtual servers offered high performance for short intervals at low cost.

Almost all websites have some static content, as supporting assets such as images and style sheets are usually static, even on a website with highly dynamic pages.

Dynamic websites


Dynamic websites are generated on the fly and use server-side technology to generate web pages. They typically extract their content from one or more back-end databases: some are database queries across a relational database to query a catalogue or to summarise numeric information, while others may use a NoSQL document database such as MongoDB to store larger units of content, such as blog posts or wiki articles.

In the design process, dynamic pages are often mocked-up or wireframed using static pages. The skillset needed to develop dynamic web pages is much broader than for a static page, involving server-side and database coding as well as client-side interface design. Even medium-sized dynamic projects are thus almost always a team effort.

When dynamic web pages first developed, they were typically coded directly in languages such as Perl, PHP or ASP. Some of these, notably PHP and ASP, used a 'template' approach where a server-side page resembled the structure of the completed client-side page, and data was inserted into places defined by 'tags'. This was a quicker means of development than coding in a purely procedural coding language such as Perl.
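The 'template' approach described above can be sketched in JavaScript rather than the Perl, PHP, or ASP of the era: the server-side page resembles the finished markup, and data is substituted into placeholder tags before being sent to the client. The {{name}} placeholder syntax below is an assumption for this sketch, not a specific historical engine.

```javascript
// Minimal illustration of the server-side "template" idea.
// The {{key}} placeholder syntax is invented for this sketch.

function renderTemplate(template, data) {
  // Replace each {{key}} placeholder with the matching value from data;
  // unknown keys are left untouched.
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in data ? String(data[key]) : match
  );
}

// A "page" that mirrors the structure of the final HTML:
const pageTemplate =
  "<h1>{{title}}</h1>\n<p>Posted by {{author}}</p>";

// Data that would typically come from a back-end database query:
const post = { title: "Hello, web", author: "Ada" };

console.log(renderTemplate(pageTemplate, post));
// → <h1>Hello, web</h1>
// → <p>Posted by Ada</p>
```

This captures why the template approach was quicker than purely procedural coding: the page's structure stays visible in one piece instead of being assembled from print statements.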

Both of these approaches have now been supplanted for many websites by higher-level application-focused tools such as content management systems. These build on top of general-purpose coding platforms and assume that a website exists to offer content according to one of several well-recognised models, such as a time-sequenced blog, a thematic magazine or news site, a wiki, or a user forum. These tools make the implementation of such a site very easy, and a purely organizational and design-based task, without requiring any coding.

Editing the content itself (as well as the template page) can be done both through the site itself and with third-party software. The ability to edit all pages is usually provided only to a specific category of users (for example, administrators or registered users). In some, less frequent, cases anonymous users are allowed to edit certain web content (for example, adding messages on forums). A well-known example of a site that allows anonymous edits is Wikipedia.

Homepage design


Usability experts, including Jakob Nielsen and Kyle Soucy, have often emphasised homepage design for website success, asserting that the homepage is the most important page on a website; Nielsen and Tahir's Homepage Usability: 50 Websites Deconstructed (New Riders Publishing, 2001) is a book-length treatment of the idea.[21][22][23] However, practitioners into the 2000s were starting to find that a growing share of website traffic was bypassing the homepage, going directly to internal content pages through search engines, e-newsletters, and RSS feeds.[24] This led many practitioners to argue that homepages are less important than most people think.[25][26][27][28] Jared Spool argued in 2007 that a site's homepage was actually the least important page on a website.[29]

In 2012 and 2013, carousels (also called 'sliders' and 'rotating banners') became an extremely popular design element on homepages, often used to showcase featured or recent content in a confined space.[30] Many practitioners argue that carousels are an ineffective design element that hurts a website's search engine optimisation and usability.[30][31][32]

Occupations


There are two primary jobs involved in creating a website: the web designer and web developer, who often work closely together on a website.[33] The web designers are responsible for the visual aspect, which includes the layout, colouring, and typography of a web page. Web designers will also have a working knowledge of markup languages such as HTML and CSS, although the extent of their knowledge will differ from one web designer to another. Particularly in smaller organizations, one person will need the necessary skills for designing and programming the full web page, while larger organizations may have a web designer responsible for the visual aspect alone.

Further jobs which may become involved in the creation of a website include:

  • Graphic designers to create visuals for the site such as logos, layouts, and buttons
  • Internet marketing specialists to help maintain a web presence through strategic solutions for targeting viewers to the site, using marketing and promotional techniques on the internet
  • SEO writers to research and recommend the correct words to be incorporated into a particular website and make the website more accessible and findable on numerous search engines
  • Internet copywriters to create the written content of the page to appeal to the targeted viewers of the site[1]
  • User experience (UX) designers to incorporate aspects of user-focused design considerations, including information architecture, user-centred design, user testing, interaction design, and occasionally visual design

Artificial intelligence and web design


ChatGPT and other AI models are being used to write and code websites, making site creation faster and easier. Discussion continues about the ethical implications of using artificial intelligence in design as the world becomes more familiar with applying AI to time-consuming tasks in design processes.[34]



References

  1. ^ a b Lester, Georgina. "Different jobs and responsibilities of various people involved in creating a website". Arts Wales UK. Retrieved 2012-03-17.
  2. ^ CPBI, Ryan Shelley. "The History of Website Design: 30 Years of Building the Web [2022 Update]". www.smamarketing.net. Retrieved 2022-10-12.
  3. ^ "Longer Biography". Retrieved 2012-03-16.
  4. ^ "Mosaic Browser" (PDF). Archived from the original (PDF) on 2013-09-02. Retrieved 2012-03-16.
  5. ^ Zwicky, E.D; Cooper, S; Chapman, D.B. (2000). Building Internet Firewalls. United States: O'Reilly & Associates. p. 804. ISBN 1-56592-871-7.
  6. ^ a b c d Niederst, Jennifer (2006). Web Design In a Nutshell. United States of America: O'Reilly Media. pp. 12–14. ISBN 0-596-00987-9.
  7. ^ a b Chapman, Cameron, The Evolution of Web Design, Six Revisions, archived from the original on 30 October 2013
  8. ^ "AMO.NET America's Multimedia Online (Internet Explorer 6 PREVIEW)". amo.net. Retrieved 2020-05-27.
  9. ^ a b "W3C Markup Validation Service".
  10. ^ W3C. "Web Accessibility Initiative (WAI)".
  11. ^ "What is Web Design?". The Interaction Design Foundation. Retrieved 2022-10-12.
  12. ^ Thorlacius, Lisbeth (2007). "The Role of Aesthetics in Web Design". Nordicom Review. 28 (28): 63–76. doi:10.1515/nor-2017-0201. S2CID 146649056.
  13. ^ "What is a Web Designer? (2022 Guide)". BrainStation®. Retrieved 2022-10-28.
  14. ^ Castañeda, J. Alberto; Muñoz-Leiva, Francisco; Luque, Teodoro (2007). "Web Acceptance Model (WAM): Moderating effects of user experience". Information & Management. 44 (4): 384–396. doi:10.1016/j.im.2007.02.003.
  15. ^ "Building a resilient frontend using progressive enhancement". GOV.UK. Retrieved 27 October 2021.
  16. ^ "Rolling out mobile-first indexing". Official Google Webmaster Central Blog. Retrieved 2018-06-09.
  17. ^ Stone, John (2009-11-16). "20 Do's and Don'ts of Effective Web Typography". Retrieved 2012-03-19.
  18. ^ World Wide Web Consortium: Understanding Web Content Accessibility Guidelines 2.2.2: Pause, Stop, Hide
  19. ^ W3C QA. "My Web site is standard! And yours?". Retrieved 2012-03-21.
  20. ^ Christensen, Mathias Biilmann (2015-11-16). "Static Website Generators Reviewed: Jekyll, Middleman, Roots, Hugo". Smashing Magazine. Retrieved 2016-10-26.
  21. ^ Soucy, Kyle, Is Your Homepage Doing What It Should?, Usable Interface, archived from the original on 8 June 2012
  22. ^ Nielsen, Jakob (10 November 2003), The Ten Most Violated Homepage Design Guidelines, Nielsen Norman Group, archived from the original on 5 October 2013
  23. ^ Knight, Kayla (20 August 2009), Essential Tips for Designing an Effective Homepage, Six Revisions, archived from the original on 21 August 2013
  24. ^ Spool, Jared (29 September 2005), Is Home Page Design Relevant Anymore?, User Interface Engineering, archived from the original on 16 September 2013
  25. ^ Chapman, Cameron (15 September 2010), 10 Usability Tips Based on Research Studies, Six Revisions, archived from the original on 2 September 2013
  26. ^ Gócza, Zoltán, Myth #17: The homepage is your most important page, archived from the original on 2 June 2013
  27. ^ McGovern, Gerry (18 April 2010), The decline of the homepage, archived from the original on 24 May 2013
  28. ^ Porter, Joshua (24 April 2006), Prioritizing Design Time: A Long Tail Approach, User Interface Engineering, archived from the original on 14 May 2013
  29. ^ Spool, Jared (6 August 2007), Usability Tools Podcast: Home Page Design, archived from the original on 29 April 2013
  30. ^ a b Messner, Katie (22 April 2013), Image Carousels: Getting Control of the Merry-Go-Round, Usability.gov, archived from the original on 10 October 2013
  31. ^ Jones, Harrison (19 June 2013), Homepage Sliders: Bad For SEO, Bad For Usability, archived from the original on 22 November 2013
  32. ^ Laja, Peep (8 June 2019), Image Carousels and Sliders? Don't Use Them. (Here's why.), CXL, archived from the original on 10 December 2019
  33. ^ Oleksy, Walter (2001). Careers in Web Design. New York: The Rosen Publishing Group, Inc. pp. 9–11. ISBN 978-0-8239-3191-0.
  34. ^ Visser, Larno, et al. ChatGPT for Web Design : Create Amazing Websites. [First edition]., PACKT Publishing, 2023.

Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines.[1][2] SEO targets unpaid search traffic (usually referred to as "organic" results) rather than direct traffic, referral traffic, social media traffic, or paid traffic.

Unpaid search engine traffic may originate from a variety of kinds of searches, including image search, video search, academic search,[3] news search, and industry-specific vertical search engines.

As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine results, what people search for, the actual search queries or keywords typed into search engines, and which search engines are preferred by a target audience. SEO is performed because a website receives more visitors from a search engine when it ranks higher within a search engine results page (SERP), with the aim of either converting those visitors or building brand awareness.[4]

History


Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters submitted the address of a page, or URL, to the various search engines, which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5]

According to a 2004 article by former industry analyst and current Google employee Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits SEO practitioner Bruce Clay as one of the first people to popularize the term.[6]

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.[7] Web content providers also manipulated attributes within the HTML source of a page in an attempt to rank well in search engines.[8] By 1997, search engine designers recognized that webmasters were making efforts to rank in search engines and that some webmasters were manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[9]

By heavily relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[10]

Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.

Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[11][12] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[13] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and tracks the web pages' index status.

In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[14]

Relationship with Google


In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[15] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
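The random-surfer model described above can be sketched as a short power iteration (an illustrative toy, not Google's implementation; the 0.85 damping factor follows the original PageRank paper):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank for a dict of page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a (1 - damping) share of the uniform jump probability.
        new = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            if outbound:
                share = rank[page] / len(outbound)
                for target in outbound:
                    new[target] += damping * share
            else:  # dangling page: spread its rank evenly across all pages
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# A page with more inbound links ends up with a higher score.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
```

Because the scores form a probability distribution over pages, they always sum to 1; here page C, with two inbound links, outscores page B, which has one.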

Page and Brin founded Google in 1998.[16] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[17] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes involved the creation of thousands of sites for the sole purpose of link spamming.[18]

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.[19] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[20] Patents related to search engines can provide information to better understand search engines.[21] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[22]

In 2007, Google announced a campaign against paid links that transfer PageRank.[23] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollow links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[24] As a result of this change, the use of nofollow led to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[25]

In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[26] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up quicker on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[27] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[28]

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[29] The 2012 Google Penguin attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[30] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[31] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to better match the pages to the meaning of the query rather than a few words.[32] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.

In October 2019, Google announced they would start applying BERT models for English language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve their natural language processing, but this time in order to better understand the search queries of their users.[33] In terms of search engine optimization, BERT intended to connect users more easily to relevant content and increase the quality of traffic coming to websites that are ranking in the Search Engine Results Page.

Methods


Getting indexed

A simple illustration of the PageRank algorithm; percentages show perceived importance.

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[34] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[35] in addition to its URL submission console.[36] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[37] however, this practice was discontinued in 2009.
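A minimal XML Sitemap of the kind submitted via Search Console can be generated with the standard library. This is only a sketch (URLs are placeholders, and the sitemaps.org protocol supports further optional tags such as `changefreq` and `priority`):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml body from (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc        # required: the page URL
        ET.SubElement(url, "lastmod").text = lastmod  # optional: last modified date
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([("https://example.com/", "2024-01-01")])
```

The resulting string can be written to `sitemap.xml` at the site root and its URL submitted in Search Console.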

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[38]

Mobile devices are used for the majority of Google searches.[39] In November 2016, Google announced a major change to the way they are crawling websites and started to make their index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in their index.[40] In May 2019, Google updated the rendering engine of their crawler to be the latest version of Chromium (74 at the time of the announcement). Google indicated that they would regularly update the Chromium rendering engine to the latest version.[41] In December 2019, Google began updating the User-Agent string of their crawler to reflect the latest Chrome version used by their rendering service. The delay was to allow webmasters time to update their code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.[42]

Preventing crawling


To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[43]
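The robots.txt behaviour described above can be checked with Python's standard-library parser; a small sketch (the rules shown are illustrative, matching the shopping-cart and internal-search examples in the text):

```python
import urllib.robotparser

# Illustrative robots.txt content blocking cart pages and internal search results.
rules = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

blocked = rp.can_fetch("*", "https://example.com/cart/checkout")  # False: disallowed
allowed = rp.can_fetch("*", "https://example.com/about")          # True: no rule matches
```

In production, `rp.set_url(".../robots.txt")` followed by `rp.read()` fetches the live file instead of parsing a string.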

In 2020, Google sunsetted the standard (and open-sourced their code) and now treats it as a hint rather than a directive. To adequately ensure that pages are not indexed, a page-level robots meta tag should be included.[44]

Increasing prominence


A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it; when people bounce off a site, it counts against the site and affects its credibility.[45]

Writing content that includes frequently searched keyword phrases so as to be relevant to a wide variety of search queries will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[46] or via 301 redirects can help make sure links to different versions of the URL all count towards the page's link popularity score. These are known as incoming links, which point to the URL and can count towards the page link's popularity score, impacting the credibility of a website.[45]
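The canonical link element mentioned above is written as `<link rel="canonical" href="…">` in the page head. Server-side, URL variants are often normalized before choosing the canonical form; the following is a minimal, illustrative sketch of such normalization (the specific rules, such as which tracking parameters to strip, are assumptions for the example, not a standard):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative set of tracking parameters to strip (an assumption, not exhaustive).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def canonicalize(url: str) -> str:
    """Map URL variants (case, default port, tracking params, trailing slash)
    to one canonical form, so link signals consolidate on a single URL."""
    parts = urlsplit(url)
    host = (parts.hostname or "").lower()
    # Keep the port only if it is not the scheme's default.
    if parts.port and not ((parts.scheme == "http" and parts.port == 80) or
                           (parts.scheme == "https" and parts.port == 443)):
        host = f"{host}:{parts.port}"
    # Drop tracking parameters, keep the rest in original order.
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k not in TRACKING_PARAMS])
    # Normalise a trailing slash on non-root paths.
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, host, path, query, ""))

canonical = canonicalize("HTTP://Example.com:80/Page/?utm_source=news")
# canonical == "http://example.com/Page"
```

The site would then either 301-redirect the variants to this form or emit it in the page's canonical link element.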

White hat versus black hat techniques

Common white-hat methods of search engine optimization

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.[47] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[48]

An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[11][12][49] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[50] although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off-screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not focus on producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.

Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[51] Both companies subsequently apologized, fixed the offending pages, and were restored to Google's search engine results page.[52]

Companies that employ black hat techniques or other spammy tactics can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[53] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[54] Google's Matt Cutts later confirmed that Google had banned Traffic Power and some of its clients.[55]

As marketing strategy


SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more so than relevance; website developers should regard SEM with the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search.[56] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[57][58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[60] Google has been one of the companies utilizing the popularity of mobile usage by encouraging websites to use its Google Search Console and the Mobile-Friendly Test, which allow companies to measure their website against the search engine results and determine how user-friendly their websites are. The closer together related keywords appear, the more a page's ranking will improve for those key terms.[45]

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[61] Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[62] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[63] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.

International markets and SEO


Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and data showed Google was the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of March 2024, Google still had a significant market share of 89.85% in Germany.[67] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[68][obsolete source] As of March 2024, Google's market share in the UK was 93.61%.[69]

Successful search engine optimization (SEO) for international markets requires more than just translating web pages. It may also involve registering a domain name with a country-code top-level domain (ccTLD) or a relevant top-level domain (TLD) for the target market, choosing web hosting with a local IP address or server, and using a Content Delivery Network (CDN) to improve website speed and performance globally. It is also important to understand the local culture so that the content feels relevant to the audience. This includes conducting keyword research for each market, using hreflang tags to target the right languages, and building local backlinks. However, the core SEO principles—such as creating high-quality content, improving user experience, and building links—remain the same, regardless of language or region.[66]
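The hreflang tags mentioned above tell search engines which language or regional variant of a page to serve. A minimal sketch of generating them (the locales and URLs are placeholders; Google's guidance is that every variant should list all alternates, including itself, with `x-default` as the fallback):

```python
def hreflang_tags(urls_by_locale):
    """Render the <link rel="alternate"> hreflang tags for one page's
    language/region variants, from a locale -> URL mapping."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{locale}" href="{url}" />'
        for locale, url in sorted(urls_by_locale.items())
    )

tags = hreflang_tags({
    "en-au": "https://example.com/au/",
    "en-gb": "https://example.com/uk/",
    "x-default": "https://example.com/",
})
```

The same block of tags goes into the head of every variant, so each version cross-references the full set.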

Regional search engines have a strong presence in specific markets:

  • China: Baidu leads the market, controlling about 70 to 80% market share.[70]
  • South Korea: Since the end of 2021, Naver, a domestic web portal, has gained prominence in the country.[71][72]
  • Russia: Yandex is the leading search engine in Russia. As of December 2023, it accounted for at least 63.8% of the market share.[73]

The Evolution of International SEO


By the early 2000s, businesses recognized that the web and search engines could help them reach global audiences. As a result, the need for multilingual SEO emerged.[74] In the early years of international SEO development, simple translation was seen as sufficient. However, over time, it became clear that localization and transcreation—adapting content to local language, culture, and emotional resonance—were far more effective than basic translation.[75]

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[76][77]

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[78][79]


References

  1. ^ "SEO – search engine optimization". Webopedia. December 19, 2001. Archived from the original on May 9, 2019. Retrieved May 9, 2019.
Local search engine optimization (local SEO) is similar to (national) SEO in that it is also a process affecting the visibility of a website or a web page in a web search engine's unpaid results, often referred to as "natural", "organic", or "earned" results, on the search engine results page (SERP).[1] In general, the higher a site ranks on the search results page and the more frequently it appears in the results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers.[2] Local SEO, however, differs in that it focuses on optimizing a business's online presence so that its web pages will be displayed by search engines when users enter local searches for its products or services.[3] Ranking for local search involves a similar process to general SEO but includes some specific elements to rank a business for local search.

In practice, local SEO focuses on optimizing a business's online presence to attract more business from relevant local searches. The majority of these searches take place on Google, Yahoo, Bing, Yandex, Baidu, and other search engines, but local visibility also benefits from listings on sites such as Yelp, Angie's List, LinkedIn, local business directories, and social media channels.[4]

The birth of local SEO


The origin of local SEO can be traced back to 2003–2005,[5] when search engines began trying to provide people with results in their vicinity as well as additional information such as a store's opening times, listings in maps, etc.

Local SEO has evolved over the years into a targeted online marketing approach that allows local businesses to appear based on a range of local search signals, distinguishing it from broader organic SEO, which prioritises the relevance of a result to the query over its distance from the searcher.

Local search results


Local searches trigger search engines to display two types of results on the search engine results page: local organic results and the 'Local Pack'.[3] The local organic results include web pages related to the search query with local relevance; these often include directories such as Yelp, Yellow Pages, Facebook, etc.[3] The Local Pack displays businesses that have signed up with Google and taken ownership of their 'Google My Business' (GMB) listing.

The information displayed in the GMB listing and hence in the Local Pack can come from different sources:[6]

  • The owner of the business. This information can include opening/closing times, description of products or services, etc.
  • Information taken from the business's website
  • User-provided information such as reviews or uploaded photos
  • Information from other sources such as social profiles etc.
  • Structured Data taken from Wikidata and Wikipedia. Data from these sources is part of the information that appears in Google's Knowledge Panel in the search results.

Depending on the searches, Google can show relevant local results in Google Maps or Search. This is true on both mobile and desktop devices.[7]

Google Maps


Google has added a Q&A feature to Google Maps, allowing users to submit questions to business owners and allowing owners to respond.[8] This Q&A feature is tied to the associated Google My Business account.

Google Business Profile


Google Business Profile (GBP), formerly Google My Business (GMB), is a free tool that allows businesses to create and manage their Google Business listing. These listings must represent a physical location that a customer can visit. A Google Business listing appears when customers search for businesses either on Google Maps or in Google SERPs. The accuracy of these listings is a local ranking factor.

Ranking factors


Major search engines have algorithms that determine which local businesses rank in local search. Primary factors that impact a local business's chance of appearing in local search include proper categorization in business directories, a business's name, address, and phone number (NAP) being crawlable on the website, and citations (mentions of the local business on other relevant websites like a chamber of commerce website).[9]
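The NAP-consistency requirement above can be illustrated with a small normalization sketch. The normalization rules and sample data below are invented for illustration; they are not drawn from any search engine's actual matching logic:

```python
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    """Normalize a Name/Address/Phone citation so common variants compare equal."""
    norm_name = re.sub(r"[^a-z0-9 ]", "", name.lower()).strip()
    # Collapse a few common address abbreviations (illustrative, not exhaustive).
    addr = address.lower()
    for full, abbrev in [("street", "st"), ("avenue", "ave"), ("suite", "ste")]:
        addr = re.sub(rf"\b{full}\b", abbrev, addr)
    norm_addr = re.sub(r"[^a-z0-9 ]", "", addr).strip()
    norm_phone = re.sub(r"\D", "", phone)  # keep digits only
    return (norm_name, norm_addr, norm_phone)

def citations_consistent(citations: list) -> bool:
    """True if every (name, address, phone) citation normalizes identically."""
    normalized = {normalize_nap(*c) for c in citations}
    return len(normalized) == 1

# Two citations of the same (fictional) business with cosmetic differences.
citations = [
    ("Acme Dental", "12 George Street, Suite 4", "(02) 9000 1234"),
    ("ACME Dental", "12 George St, Ste 4", "02 9000 1234"),
]
print(citations_consistent(citations))  # True: both normalize to the same tuple
```

Auditing tools in this space apply the same idea at scale: citations that survive normalization but still disagree are flagged as the inconsistencies the paragraph above describes.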

In 2016, a study using statistical analysis assessed how and why businesses ranked in the Local Packs and identified positive correlations between local rankings and 100+ ranking factors.[10] Although the study cannot replicate Google's algorithm, it did deliver several interesting findings:

  • Backlinks showed the most important correlation (and also Google's Toolbar PageRank, suggesting that older links are an advantage because the Toolbar has not been updated in a long time).
  • Sites with more content (hence more keywords) tended to fare better (as expected).
  • Reviews on GMB also were found to strongly correlate with high rankings.
  • Other GMB factors, like the presence of photos and having a verified GMB page with opening hours, showed a positive correlation (with ranking) albeit not as important as reviews.
  • The quality of citations such as a low number of duplicates, consistency and also a fair number of citations, mattered for a business to show in Local Packs. However, within the pack, citations did not influence their ranking: "citations appear to be foundational but not a competitive advantage."
  • The authors were instead surprised that geotargeting elements (city & state) in the title of the GMB landing page did not have any impact on GMB rankings. Hence the authors suggest using such elements only if it makes sense for usability reasons.
  • The presence of a keyword in the business name was found to be one of the most important factors (explaining the high incidence of spam in the Local Pack).
  • Schema structured data is a ranking factor. Adding the 'LocalBusiness' markup enables a business to surface relevant information to Google, including opening hours, address, founder, parent-company information and much more.[11]
  • The number of reviews and overall star rating correlates with higher rankings in the Google map pack results.
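The 'LocalBusiness' structured data mentioned in the list above is typically embedded in a page as a JSON-LD `<script>` block. A minimal sketch of generating such a snippet follows; all business details are invented, and real markup would use whichever schema.org properties apply to the business:

```python
import json

# Minimal illustrative LocalBusiness JSON-LD; every business detail is fictional.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Cafe",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Sydney",
        "addressRegion": "NSW",
        "postalCode": "2000",
        "addressCountry": "AU",
    },
    "telephone": "+61-2-9000-0000",
    "openingHours": "Mo-Fr 08:00-17:00",
}

# Wrap in a <script> tag for embedding in the page's <head> or <body>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

Search engines parse such blocks directly, which is why keeping the markup in sync with the visible page content (and the GMB listing) matters.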

Local ranking according to Google


Prominence, relevance, and distance are the three main criteria Google claims to use in its algorithms to show results that best match a user's query.[12]

  • Prominence reflects how well known a place is in the offline world. An important museum or store, for example, will be given more prominence. Google also uses information obtained on the web, such as review counts, links, and articles, to assess prominence.
  • Relevance refers to how well a listing matches a user's query; Google's algorithms attempt to surface the listings that best match it.
  • Distance refers to Google's attempt to return the listings that are closest to the location terms used in a user's query. If no location term is used, then "Google will calculate distance based on what's known about their location".

Local ranking: 2017 survey from 40 local experts


According to a group of local SEO experts who took part in a survey, links and reviews are more important than ever to rank locally.[13]

Near Me Queries


As a result of both Google and Apple offering "near me" as an option to users, some authors[14] report that Google Trends shows very significant increases in "near me" queries. The same authors also report that the factors correlating most strongly with Local Pack ranking for "near me" queries include the presence of the "searched city and state in backlinks' anchor text" as well as the use of "'near me' in internal link anchor text".

Possum Update


The Possum update, an important change to Google's local algorithm, rolled out on September 1, 2016.[15] Its main effects on local search results were:

  • Businesses located outside a city's physical limits showed a significant increase in ranking in the Google Local Pack.
  • A more restrictive filter is in place. Before the update, Google filtered listings linking to the same website and using the same phone number. After the update, listings are filtered if they share the same address and the same categories, even though they belong to different businesses. So, if several dentists share the same address, Google will only show one of them.
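The post-update filter described above can be caricatured as keeping only the strongest listing per (address, category) pair. This is a speculative sketch of the described behavior, not Google's actual algorithm; the listing fields, ranks, and businesses are invented:

```python
from collections import OrderedDict

def possum_filter(listings):
    """Keep only the best-ranked listing per (address, category) pair.

    Speculative sketch of the filtering behavior described for the Possum
    update, not Google's algorithm. Each listing is a dict with 'name',
    'address', 'category', and 'rank' (lower rank = stronger listing).
    """
    best = OrderedDict()
    for listing in sorted(listings, key=lambda l: l["rank"]):
        key = (listing["address"].lower(), listing["category"].lower())
        best.setdefault(key, listing)  # first (best-ranked) listing wins
    return list(best.values())

listings = [
    {"name": "Dr. Smith", "address": "5 Pitt St", "category": "dentist", "rank": 1},
    {"name": "Dr. Jones", "address": "5 Pitt St", "category": "dentist", "rank": 2},
    {"name": "Pitt St Cafe", "address": "5 Pitt St", "category": "cafe", "rank": 3},
]
print([l["name"] for l in possum_filter(listings)])  # ['Dr. Smith', 'Pitt St Cafe']
```

In this toy example the two dentists at the same address collapse into one Local Pack entry, while the cafe at the same address survives because its category differs.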

Hawk update


As previously explained (see above), the Possum update led similar listings, within the same building or even located on the same street, to get filtered. As a result, only one listing "with greater organic ranking and stronger relevance to the keyword" would be shown.[16] After the Hawk update on August 22, 2017, this filtering seems to apply only to listings located within the same building or close by (e.g., 50 feet), but not to listings located further away (e.g., 325 feet away).[16]

Fake reviews


As previously explained (see above), reviews are deemed to be an important ranking factor. Joy Hawkins, a Google Top Contributor and local SEO expert, highlights the problems due to fake reviews:[17]

  • Lack of an appropriate process for business owners to report fake reviews on competitors' listings. GMB support will only consider requests that come from the owner of the business in question, so if a competitor nearby has been collecting fake reviews, the only way to bring this to the attention of GMB is via the Google My Business Forum.
  • Unlike Yelp, Google does not show a label warning users of abnormal review behavior for those businesses that buy reviews or that receive unnatural numbers of negative reviews because of media attention.
  • Current Google algorithms do not identify unnatural review patterns, even though abnormal patterns often do not require human judgment and should be easily identified by algorithms; both fake listings and rogue reviewer profiles could then be suspended.
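As a rough illustration of how an algorithm might flag abnormal review velocity without human gauging, the sliding-window count below is a naive sketch; the window size and sample dates are invented, and real systems would weigh many more signals (reviewer history, text, IP addresses, etc.):

```python
from datetime import date

def review_burst_score(review_dates, window_days=7):
    """Return the largest number of reviews falling within any window_days span.

    Naive illustration of detecting unnatural review velocity; the window
    size is an arbitrary choice, not a known threshold used by any platform.
    """
    days = sorted(d.toordinal() for d in review_dates)
    best = 0
    start = 0
    for end in range(len(days)):
        # Shrink the window until it spans fewer than window_days days.
        while days[end] - days[start] >= window_days:
            start += 1
        best = max(best, end - start + 1)
    return best

# Nine reviews landing within three days looks suspicious for a quiet business.
burst = [date(2017, 6, 1)] * 4 + [date(2017, 6, 2)] * 3 + [date(2017, 6, 3)] * 2
print(review_burst_score(burst))  # 9
```

A score far above a business's historical baseline is the kind of pattern the bullet above argues should be machine-detectable.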

References

  1. ^ Harnish, Brian (December 26, 2018). "The Definitive Guide to Local SEO". Search Engine Journal. Retrieved October 1, 2019.
  2. ^ Ortiz-Cordova, A. and Jansen, B. J. (2012). "Classifying Web Search Queries in Order to Identify High Revenue Generating Customers". Journal of the American Society for Information Sciences and Technology. 63(7), 1426–1441.
  3. ^ a b c "SEO 101: Getting Started in Local SEO (From Scratch) | SEJ". Search Engine Journal. March 30, 2015. Retrieved March 26, 2017.
  4. ^ Imel, Seda (June 21, 2019). "The Importance Of Local SEO Statistics You Should Know "Infographic"". SEO MediaX.
  5. ^ "The Evolution Of SEO Trends Over 25 Years". Search Engine Land. June 24, 2015. Retrieved March 26, 2017.
  6. ^ "Improve your local ranking on Google - Google My Business Help". support.google.com. Retrieved March 26, 2017.
  7. ^ "How Google uses business information". support.google.com. Retrieved March 16, 2017.
  8. ^ "6 things you need to know about Google's Q&A feature on Google Maps". Search Engine Land. September 7, 2017. Retrieved October 2, 2017.
  9. ^ "Citation Inconsistency Is No.1 Issue Affecting Local Ranking". Search Engine Land. December 22, 2014. Retrieved March 26, 2017.
  10. ^ "Results from the Local SEO Ranking Factors Study presented at SMX East". Search Engine Land. October 7, 2016. Retrieved May 2, 2017.
  11. ^ "LocalBusiness - schema.org". schema.org. Retrieved November 20, 2018.
  12. ^ "Improve your local ranking on Google - Google My Business Help". support.google.com. Retrieved March 16, 2017.
  13. ^ "Just released: 2017 Local Search Ranking Factors survey results". Search Engine Land. April 11, 2017. Retrieved May 2, 2017.
  14. ^ "'Things to do near me' SEO". Search Engine Land. February 13, 2017. Retrieved March 26, 2017.
  15. ^ "Everything you need to know about Google's 'Possum' algorithm update". Search Engine Land. September 21, 2016. Retrieved May 18, 2017.
  16. ^ a b "August 22, 2017: The day the 'Hawk' Google local algorithm update swooped in". Search Engine Land. September 8, 2017. Retrieved October 2, 2017.
  17. ^ "Dear Google: 4 suggestions for fixing your massive problem with fake reviews". Search Engine Land. June 15, 2017. Retrieved July 16, 2017.