
Towards active SEO (search engine optimization) 2.0

Charles-Victor Boutet (in memoriam); Luc Quoniam; William Samuel Ravatua Smith

Université du Sud Toulon-Var - Ingémédia, France

Correspondence: Charles-Victor Boutet (in memoriam), UFR Ingémédia, France

ABSTRACT

In the age of the writable web, new skills and new practices are appearing. In an environment that allows everyone to communicate information globally, internet referencing (or SEO) is a strategic discipline that aims to generate visibility, internet traffic and maximum exploitation of a site's publications. Often misperceived as fraud, SEO has evolved into a facilitating tool for anyone who wishes to reference their website with search engines. In this article we show that it is possible to achieve first rank in the search results for highly competitive keywords, with methods that are quick, sustainable and legal, while applying the principles of active SEO 2.0. The article also clarifies some of the working functions of search engines, presents some advanced referencing techniques (completely ethical and legal) and lays the foundations for an in-depth reflection on the qualities and advantages of these techniques.

Keywords: Active SEO, Search Engine Optimization, SEO 2.0, search engines

1. INTRODUCTION

With the Web 2.0 era and the writable internet, opportunities related to internet referencing have increased: everyone can easily create a virtual territory consisting of one to thousands of sites, and practically all 2.0 territories are designed to be participatory, so anyone can write and promote their sites. In addition, search engines such as Google issue guidelines on referencing practices; failure to follow these guidelines can lead to the exclusion of a website from the index or severely degrade its performance in terms of positioning. In general, those who venture off the beaten path are often labeled "black hat SEO" without necessarily being harmful or malevolent in any way. In recent years, a terminology has emerged that attributes a color to referencers based on their practices, such as black, gray or white. Throughout this article we explore new SEO practices that take advantage of Web 2.0 structuralism, illustrate the study with concrete examples and a semiotic analysis of the aforementioned designations, and then introduce the concept of active SEO 2.0.

2. CONTEXT

2.1 An information society

In his book "La rencontre des mondes", Paul Rasse (2005) introduced the concept of the connectivity revolution; the start of automated information exchange was launched with the invention of the automatic telephone in the 1940s. In the following decades, this automation has become ubiquitous, resulting in an information society in which automated information is produced everywhere and through every act of everyone's mundane daily life as databases are fed and consumer profiles are created as we pass through the checkout lane at the supermarket (Larose et al, 2005). The democratization of high speed internet connections has allowed households to connect to the internet in perpetuity, leaving free access to a new territory in which it has quickly and easily left a footprint.

2.2 A paradigm shift

Web 2.0 has generated a lot of interest even though the "old timers" consider it a buzzword invented by internet businessmen in order to sell their customers technologies that are sometimes dated, such as javascript (Quoniam et al., 2008). These technologies have nonetheless flooded the internet with innovative uses. Since the design of modern computer architecture, a computing resource may or may not grant the user three rights: read, write and execute. Before Web 2.0, writing was limited and information diffusion followed the "one-to-many" model: a few transmitters and many receivers. Since then, Web 2.0 has hosted podcasts, weblogs and wikis. Overall, Web 2.0 contributions allow everyone to relay or publish information on platforms whether or not they are the main editor. Everyone is both a potential transmitter and receiver; "the historical gap that separates the producer and consumer" (Toffler, 1980) is closing, and a "many-to-many" model offers an increased vectorization of information that is accelerated and relayed to the very end, developing a "rage to connect" (Ramonet, 2001) now embodied by the buzz effect. The virtual territory is potentially unlimited in size, forms a structure composed of an astronomical number of websites, and is one of the new territories to be claimed by SEO strategies (see below).

2.3 Research on the internet

In France, nearly 90% of internet searches are performed through Google (AT Internet Institute, 2009). A study conducted with a panel of 1,403 users showed that 90% of them, when using a search engine, do not consult results beyond the third page (iProspect, 2006), as shown in Figure 1. Other studies show that, overall, internet users consult only the first page of results, which is typically composed of 10 entries (Spink et al., 2004). SEO therefore matters greatly for websites, particularly as we are in a period of paradigm shift in information communication, where a many-to-many model (Quoniam et al., 2008) is replacing the mass media era of one-to-many (De Rosnay, 2006). Everyone can have worldwide visibility through search engines, a privilege previously reserved for prestigious broadcasters. Thus, when someone uses a search engine for a specifically targeted term, it is preferable that one's website be well positioned; this is the main objective of SEO techniques.


As the amount of information on the internet grows at a frantic pace, the use of search engines to find pertinent information becomes increasingly critical. A search engine retrieves pages supposedly relevant to the user's request, comparing the attributes of the page with other characteristics such as anchor texts, and returns the results that best match the query (Svore, 2007). The user sees a list of results ranked at 10 to 20 URLs per page. Search engine ranking has proven to be an essential component in determining how users browse the Web (Id.). In terms of referencing, being the first (or on the first page of) results returned to the user is like 'being in the right spot' while out fishing. The metaphor does not stop there: you can fish with a single line (having a single web page indexed), or fish with drift nets in an industrial manner and have 300 pages that monopolize the first 300 responses returned to a user for a precise search. In the latter case, you are certainly more likely to capture the majority of traffic generated by a keyword. Because Google is by far the most popular search engine, it is the main target of those who attempt to influence the ranking of search results (Bar-Ilan, 2007). Google has become extremely powerful; essentially, it decides who is visible and who is virtually invisible on the Web (Id.), in a completely discretionary manner.

2.4 Search engines and cognitive biases: Information overload

The problem is not new: "Historically, a large amount of information has always been a good thing: information has made the spread of cultures possible." Today, the initial benefits drawn from technologies such as search engines deteriorate with time (Id.), and now "the constant increase of information [...] at the international level [...] poses the problem of how such information will be constructed, combined and processed" (Dou et al., 2003). Information overload and the organization of information are problems, particularly on the internet.

3. THE VERTICAL MODEL

Most search engines present overabundant results in a linear cascade called a vertical display. This overabundance "is paralleled by uncertainty; it leads the user to develop protective mechanisms called unconscious biases. The biases are intended to simplify the complexity of reality and to avoid an information overload. This reflex is needed to ensure the mental health of the internaut" (Boutin, 2006). In fact, a visitor cannot value all of the results obtained from a search engine equally, simply because all of the results will not be consulted. Most people limit their search to the first few pages (iProspect, 2006), giving great importance to the first documents available and minimizing the importance of the following results (Bar-Ilan, 2007). People build their own version of reality on the basis of information provided by their senses, but this sensory input is mediated by mental processes (Heuer, 1999). In this case, a type of cognitive rigidity or resistance exists: "the phenomenon whereby people [organizations] will limit their ability to develop alternatives because of cognitive frameworks that constrain it" (Boutin, 2006). If yet another result invalidates the previous ones, there is cognitive dissonance: the individual searches for cognitive balance which, when broken, produces a state of tension that in turn motivates the individual to pursue a coherent alternative (Vaid et al., 2007). Joule and Beauvois evoke a more general perception of behavioral consistency (Id.), involving the coherence of present decisions with decisions taken in the past. That being said, the theory of cognitive dissonance presents a particularly motivating characteristic that stands out from the theory of consistency (Ibid.). Being present among the first search results, and ideally monopolizing them, can thus be a guarantee of legitimacy and certainty that leads to the capture of the traffic targeted by the search query.

4. THE INFORMATION OVERLOAD AS A STRATEGY

Consequently, strategies have been developed around the presentation of vertical data: strategists in this area knowingly create an overload of information, which Ignacio Ramonet calls "democratic censorship". Traditionally, "the concept of censorship has always been the equivalent of authoritarian power. Censorship signifies the suppression, prohibition, severing and retention of information" (Ramonet, 2001). Contrarily, "democratic censorship [...] is no longer based on the suppression, prohibition, severing and retention of data, but on the accumulation, saturation and excessive information overload" (Id.). In fact the user "is literally asphyxiated, bombarded by an avalanche of data [...] that mobilizes, occupies and saturates their time" (Ibid.). The point is mostly to create a sort of screen effect where "information hides information" (Ibid.). Figure 2 shows an example from the e-commerce site eBay: instead of using the less expensive Dutch auction system (a single listing for N copies of the same object), a seller of many USB keys chose to pay N times the listing cost in order to occupy the first pages of search results.



Applied to the search engines, this strategy creates what are called "doorway pages" or "satellite pages": a large number of pages optimized for SEO referencing on the same website, or a constellation (mininet) if the pages in question are distributed over a large number of websites. The idea is to get this large number of pages indexed among the first results of a search engine for one or more specific queries, as shown in Figure 3. For brick-and-mortar shops, the equivalent process would be to monopolize all of the storefronts along the paths most visited by potential customers in a city. We often speak of the "long tail"; materialized in this respect, it is far cheaper to grab the top of the web by creating a multitude of electronic storefronts at a ridiculous cost, thus ensuring a good audience. These processes are part of the toolbox of practices known as black hat SEO; we will further discuss the actual validity of this designation.
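To make the doorway-page mechanism concrete, here is a minimal sketch of how such a batch of keyword-optimized pages could be generated. Everything in it (the keyword variants, file names and central-site URL) is a hypothetical illustration, not the authors' actual tooling:

```python
# Minimal sketch of the "doorway pages" idea: many near-identical pages,
# each optimized for one variant of a target query. All names below are
# hypothetical placeholders.
from pathlib import Path

TEMPLATE = """<!DOCTYPE html>
<html>
<head>
  <title>{kw} - example storefront</title>
  <meta name="description" content="Everything about {kw}.">
</head>
<body>
  <h1>{kw}</h1>
  <p>Landing page optimized for the query "{kw}".</p>
  <a href="https://central-site.example/">Visit the central site</a>
</body>
</html>
"""

def build_doorways(variants, out_dir="doorways"):
    """Write one keyword-optimized HTML page per query variant."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for kw in variants:
        slug = kw.replace(" ", "-")
        (out / f"{slug}.html").write_text(TEMPLATE.format(kw=kw), encoding="utf-8")

if __name__ == "__main__":
    build_doorways(["usb key", "cheap usb key", "usb key 16gb", "buy usb key"])
```

Each generated page targets one query variant; distributed over many domains instead of one, the same templating yields a constellation rather than a single-site set of doorway pages.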


The creation of constellations and link farms, seen hereafter, is a form of information overload aimed at search engines, and is also a very effective way to obtain good indexing in view of the PageRank¹ principle.
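Since the argument repeatedly invokes the PageRank principle, a toy implementation may help fix ideas: each page's score is fed by the scores of the pages linking to it, so a site that accumulates inbound links from a constellation rises. This is a didactic power-iteration sketch of the published principle, not Google's production algorithm:

```python
# Toy PageRank by power iteration: a page's score grows with the scores
# (and link generosity) of the pages pointing to it.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:                 # dangling page: spread rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# A page with many inbound links ("hub") outranks the isolated pages.
web = {"a": ["hub"], "b": ["hub"], "c": ["hub"], "hub": ["a"]}
print(pagerank(web))
```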

5. EVOLUTION OF NETLINKING STRATEGIES: RECIPROCAL LINKS

In the era of the democratization of DSL lines and the internet, webmasters took to making bilateral link exchanges called "reciprocal links". For example, Luc had a site A and Charles a site B; Luc created a link towards Charles (from A to B) and Charles returned a link to Luc (from B to A), as described in Figure 4.


6. TRIANGULAR TRADE

In fact, a link to a site acted, and still acts, as a vote of popularity or relevance in the eyes of Google; it is the basis of PageRank (which is why Google bombing still works). Google then made the exchange of reciprocal links obsolete: it was a sort of widespread secret agreement intended to artificially drive websites up in search results (SERPs²). Webmasters quickly found a solution by initiating triangular exchanges, as shown in Figure 5. For example, Luc creates a link from his website A towards William's website B, William creates a link from website B to Charles's website C, and Charles closes the circuit by creating a link from website C back to Luc's website A.
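A short sketch shows why the triangular scheme resisted detection aimed at bilateral agreements: the ring A → B → C → A contains no pair of sites that link to each other. The site labels follow the example above:

```python
# Triangular link exchange: a ring with no reciprocal (bilateral) pairs.

links = {
    "A": {"B"},   # Luc's site A links to William's site B
    "B": {"C"},   # William's site B links to Charles's site C
    "C": {"A"},   # Charles's site C closes the circuit back to A
}

def reciprocal_pairs(links):
    """Return every pair of sites that link to each other."""
    return {frozenset((u, v))
            for u, outs in links.items()
            for v in outs
            if u in links.get(v, set())}

print(reciprocal_pairs(links))  # set(): no bilateral exchange to flag
```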


The creation of such secret agreements is more difficult, because establishing the structure requires more than a simple exchange of e-mails between two people.

7. MININETS: DIGITAL CONSTELLATIONS

Michael Campbell devised the mininets (or constellations) that he described in his 2003 e-book "Revenge of the Mininet". The principle is always to have a closed circuit of exchanged links between different websites. Campbell conducted a number of tests on the structures he had envisioned, and the structure called the "butterfly", shown in Figure 6, proved the most efficient of all those tested. The objective is, as in triangular link exchanges, to create a group of interconnected sites. In addition, Campbell introduced a central site: each site in the group, in addition to linking to the next site in the circuit, votes for the central site, which therefore receives the best indexing results. The author states that it is optimal to have 6 satellite sites promoting the central site. More is not necessarily better, and instead of creating a mininet of hundreds of satellites around a "main site", it is preferable to proceed fractally, making each first-tier satellite the center of another mininet of 6 second-tier satellites, and so on (Campbell, 2003). This resolves the referencing dilemma of "pyramid or butterfly", because we finally obtain an optimized pyramidal structure. Using this model and proceeding in fractals, the term constellation makes sense, since with three tiers of satellites the structure contains a total of 259 entities (1 + 6 + 36 + 216).
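Under this reading of Campbell's fractal scheme (a hub with 6 satellites, each satellite the hub of its own 6-satellite mininet, down to three tiers), a small sketch can generate the structure and confirm the 259-entity count; the node-naming convention is ours:

```python
# Fractal mininet generator: each node gets `fanout` satellites, each
# satellite votes for its center and links to the next satellite in the ring,
# and is itself the center of a sub-mininet until `depth` is exhausted.

def build_constellation(center, fanout=6, depth=3):
    """Return (edges, nodes) of a fractal mininet rooted at `center`."""
    edges, nodes = [], [center]
    if depth == 0:
        return edges, nodes
    satellites = [f"{center}.{i}" for i in range(fanout)]
    for i, sat in enumerate(satellites):
        edges.append((sat, center))                        # vote for the center
        edges.append((sat, satellites[(i + 1) % fanout]))  # ring link to next
        sub_edges, sub_nodes = build_constellation(sat, fanout, depth - 1)
        edges.extend(sub_edges)
        nodes.extend(sub_nodes)
    return edges, nodes

if __name__ == "__main__":
    edges, nodes = build_constellation("main")
    print(len(nodes), "entities")  # 259 = 1 + 6 + 36 + 216
```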


8. THE LINKWHEEL: THE MININET 2.0

One of the applications derived from Web 2.0 is the ability to create blogs on various hosts, and with it the opportunity to create a giant constellation, as we have done to promote a website on the keyword "competitive intelligence" (Figures 8 and 9). Another opportunity is offered by the Digg-likes³, which are in fact 2.0 directories. The Digg was supposed to be the manifestation of popular expression (through the submission of supposedly relevant articles and a referendum of votes on those items/websites), so submission to these directories (Wikio, Technorati) implies that the submitted URLs are relevant and, indeed, they go up in the SERPs. A fortiori, these structures turn the mininets they create into linkwheels, as in Figure 7.
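The linkwheel pattern just described reduces to a ring of free Web 2.0 properties, each also pointing a spoke at the site being promoted. In this sketch the blog host names are hypothetical placeholders, while the target is the site from the case study in section 9.2:

```python
# Linkwheel sketch: Web 2.0 properties in a ring, each with a spoke
# to the promoted site. Host names are hypothetical placeholders.

blogs = [f"blog{i}.example-host.com" for i in range(6)]
target = "quoniam.univ-tln.fr"   # the site promoted in the case study

wheel = []
for i, blog in enumerate(blogs):
    wheel.append((blog, blogs[(i + 1) % len(blogs)]))  # ring edge to next blog
    wheel.append((blog, target))                       # spoke edge to target

for src, dst in wheel:
    print(f"{src} -> {dst}")
```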



9. ACTIVE SEO 2.0

9.1 Definition of active SEO 2.0

Because of its writable orientation, Web 2.0 offers excellent referencing potential, and it is possible to take advantage of it without violating the law: it suffices to create a digital constellation and optimize the structure to obtain good indexing. The keystone lies in automation; here a cognitive limitation of the internaut plays in our favor, even though barriers such as captchas (see below) prevent, or at least slow down, further development. Since we are not violating any law, there is no impediment to using techniques that enable us to attract the greatest audience, and no reason to see any dark side to them, neither gray nor black hat SEO. We create our own virtual territory, as shown in Figure 8. Active SEO 2.0 can thus be defined as a set of postures aimed at achieving maximum effectiveness, going beyond the voting techniques acclaimed by search engines without breaking the law.

9.2 Active SEO applied to intelligence

To illustrate the potential of active SEO, we referenced the site http://quoniam.univ-tln.fr on the term "competitive intelligence". Originally, this site was several thousand ranks away from first place in a Google search query. In less than two weeks, we succeeded in placing it within the first twenty results on ten or so of Google's national search engines in various countries, including the top ten in France and on Google.com, as shown in Figure 9. Three months after this active SEO campaign, the page in question remained stable in these positions.


10. STRUCTURALISM OF THE ACTIVE SEO 2.0

10.1 Systematic Approach

Web 2.0 has created numerous potentials; it is still necessary to detect them. When one refers to the internet, one speaks of virtual territory. As it is virtual, its geography, limits and extent vary. Everything acts as a potential showcase, a traffic-generating resource for sales and therefore for the economy, and it is possible for anyone to build a broad audience, casting a net of unlimited size. This large extent, whether it takes the form of a constellation or a gigantic interconnection of websites, is interpreted by the judges (the search engines) as a set of independent actors voting for one another, and the engines will thus tend to reference such systems well. Beyond the economic aspect, we believe the essential point is the potential to disseminate information on a massive scale, to have an unlimited display rack and diffusion over which the initiator has control. This is a direct application of the revolt of the "pronetariat" as understood by Joël de Rosnay (De Rosnay, 2006).

10.2 Automation of the vectorization

In his book "The Third Wave", Alvin Toffler studies the major change between the first and second waves, between the impact of agriculture and that of the industrial age: the passage from the "living cell" of human energy to non-human energies that carry productivity beyond human capabilities (Toffler, 1980). We are now in a situation where every individual can transmit information globally, which is in itself a major paradigm shift.

10.3 The last barrier

Essentially, to use the opportunities of the writable web is to write back to a resource: blogs, forums, wikis, podcasts. But first of all one must complete a form proving that one is not a robot by solving a captcha⁴, as shown in Figure 10.
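As a minimal sketch of the mechanism described in note 4: render a random challenge string into a noisy image so that naive OCR-driven automation fails while a human can still read and retype it. It assumes the third-party Pillow imaging library and illustrates the principle only, not any production captcha:

```python
# Minimal captcha sketch: a random string drawn with positional jitter and
# noise lines, to defeat simple character segmentation. Requires Pillow.
import random
import string

from PIL import Image, ImageDraw

def make_captcha(length=6, path="captcha.png"):
    challenge = "".join(random.choices(string.ascii_uppercase + string.digits,
                                       k=length))
    img = Image.new("RGB", (180, 60), "white")
    draw = ImageDraw.Draw(img)
    # scatter the characters at jittered positions
    for i, ch in enumerate(challenge):
        draw.text((15 + i * 25, 15 + random.randint(-8, 8)), ch, fill="black")
    # noise lines to hinder automated segmentation
    for _ in range(8):
        draw.line([(random.randint(0, 180), random.randint(0, 60)),
                   (random.randint(0, 180), random.randint(0, 60))],
                  fill="gray")
    img.save(path)
    return challenge  # the server keeps this to check the user's answer

if __name__ == "__main__":
    print("expected answer:", make_captcha())
```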


In this case, the captcha is the last barrier, the fortress that prevents the internaut from reclaiming what appeared in the past to be an exclusivity: the opportunity for a transmitter to send information to a multitude of receivers. The introduction of these systems is a rare check on the form of organization described by Joël de Rosnay (De Rosnay, 2006). Allocating the right to write to a resource has become the norm. In the United States, 57% of teenagers create online content (Lenhart, 2005). In France, 68% of respondents reported contributing to at least one type of Web 2.0 platform (SOFRES, 2007). During 2006, 35,000 videos were posted daily on YouTube (Wired News, 2006). Before Web 2.0, permission to write was granted by e-mail systems; this right gave birth to the spam phenomenon, i.e. the automated sending of information from one individual to millions of receivers. With the new practices of Web 2.0, spamming has emerged in forums, weblogs and chatrooms, which serve as breeding grounds both for advertising and for sowing trackbacks that act as votes for the relevance of the pages concerned.

10.4 Industrialization of e-griculture

Today we find ourselves in an information economy that has, moreover, carved the bed of a new semantics borrowed from a declining mining industry: "Today, information replaces coal; we speak of 'text mining' and 'data mining', as if to describe in words, after the end of the 'black faces', the advent of a new Eldorado" (Bulinge, 2002). Constellations enable the generation of indexing potential and traffic, the fuel necessary for any website. Agriculture constitutes, according to Toffler, the first wave⁵, which permitted the formalization of this activity; subsequently, the industrial era, the second wave, enabled the automation of agriculture, adding another dimension to the sector. Similarly, Web 2.0, thanks to its writable aspect, has permitted a change in magnitude for various forms of culture that, without automation, could not be as effective: link farming⁶, PR farming⁷ and link baiting⁸ are all practices that allow one to gain control of the top spots on search engines and receive maximum visibility. Beyond information, the simple visit of an internaut to a website represents a resource, because website traffic has value. In this sense, the automation of targeted information can fool the judges, the search engines, which will cautiously allocate a certain importance to the websites concerned. Since the arrival of Web 2.0, old methods have been revived; for example, mininets become linkwheels and bear fruit: traffic and indexing potential.

10.5 The challenges of SEO

Search engines are daily crossing points for millions of internet users. They are therefore a preferred location for advertising, as shown in Figure 10.


There is a simple solution for appearing at the top of the results page for a keyword: paid search advertising. It is therefore quite clear that SEO techniques capable of achieving the same results can be seen as a loss of revenue by the search engines. As for the argument that SEO techniques straying from the advice provided by the search engines alter the relevance of search results, one need simply refer to Figure 10. Another argument used to discourage people tempted by these techniques is that they cannot reach the very top of Google listings (refuted in Figure 9), or that the results are not sustainable (the results in Figure 9 remained stable for 3 months despite the cessation of the SEO campaign). Finally, any person violating the SEO guidelines issued by the search engines will be branded a black hat SEO.

11. BLACK HAT SEO

11.1 Critical Interpretation

Literally, the term can be translated as "search engine optimization (SEO) in the black hat style". In computer science, the term "black hat" (as opposed to "white hat") refers to western films, where the good guy wears the white cowboy hat and the villain wears the black one. This dualistic choice of symbols, light against darkness, white against black, carries an eloquence that reaches far beyond our Judeo-Christian culture; the Yin and the Yang, which already appear in the Tao Te Ching, are the perfect example. The name is thus likely to be widely perceived as an insult, expressing a negative connotation or distaste.

11.2 Epistemology

For Google and for referencers, this term covers all processes that bias or unfairly influence search engine algorithms. For others, it extends to the nuisances of spam, unsolicited messages and comments, in what Matt Cutts harshly calls "blight" (Cutts, 2008). Globally, we note that the term encompasses a wide range of practices, most of which aim at profit. Thus, on forums dedicated to black hat SEO, people who promote the indexing of their websites by any means mingle with individuals seeking illegal profits through the hacking of websites or credit cards, and it is not surprising that this mix creates a bad image.

11.3 Historical Anchorage

Globally, black hat SEO involves using unconventional techniques to improve the indexing of websites. The choice of the term "black hat" refers to the world of hacking and, more generally, to the term "hacker". This name, widely used in the media, has taken root in the collective unconscious as a synonym for computer thug. In the U.S., people like Kevin Mitnick have become synonymous with the name "hacker", and many articles and documentaries on the matter were devoted or associated to him. In France, the vulgarization of the term occurred when the public discovered "carding"* (the production of fraudulent bank cards) and the case of Serge Humpich (Humpich, 2001). Today, the attribute reads like an informal fallacy: FOX 11 recently described the "Anonymous" movement as "hackers on steroids" (FOX 11, 2008). The media use of the term is in reality closer to "cracker", which would be more appropriate. Originally, the term hacker referred to bored MIT students who, for example, would repair the printers themselves. For Pekka Himanen, the term encompasses more than boredom and centers on the concept of passion; thus, one can be a hacker in everyday life, even while gardening (Himanen, 2001). Referencing professionals were clever to brandish the designation "black hat SEO": the practice allows the rapid obtention of good referencing and therefore represents a serious threat in this area, while "white hat" practitioners strictly comply with search engine recommendations and expect their competitors to do the same. We have arrived at a time when the indexing of websites is a crucial asset; if everyone works with equal strategies, any new positioning on a keyword becomes the subject of ever greater competition, and simple "white hat" referencing techniques make it difficult to find a place among the first results. Without alternative techniques, it is not easy for a newcomer to be properly indexed; discrediting the means by which one may be surpassed is therefore not without interest. The subject is no exception to the trend of "ambient binarism" (Maniez, 2008) for everything related to the web, "between curse and idolatry" (Id.). For fear of being treated as thugs, professional SEO referencers opted for "gray hat SEO", which means: not "black hat", not a bad guy, but not limited to the techniques known as "white hat" (whose limited scope everyone knows).

12. CONCLUSION

Currently, SEO techniques that do not conform strictly to the recommendations issued by search engines receive bad press, because they are considered illegal or ineffective. Our work demonstrates the contrary: using the appropriate techniques, it is possible for a single individual to position a website at the forefront of search engine results, in a sustainable manner and without acting illegally. This is active SEO 2.0 in action. If for the moment such techniques are discredited, we believe in the capacity of active SEO 2.0 and in the paradigm of "many-to-many" information diffusion. It is nothing more nor less than the potential available to each individual to communicate globally with the same opportunities as large structures, simply by using the possibilities this new technology offers for creating digital constellations. We further observe that this potential is greatly facilitated by the vertical model adopted by the vast majority of search engines.

Luc Quoniam

Full Professor in Information and Communication Sciences (71st section of the French National Council of Universities, CNU)

PhD in Information and Communication Sciences, Université Aix-Marseille III

Université du Sud Toulon-Var, Ingémédia

Avenue de l'Université - BP 20132

83957 La Garde CEDEX, France

Telephone: (33) 870405651

Homepage URL: http://quoniam.info/

Manuscript first received 16/03/2011

Manuscript accepted 10/07/2012


Erratum

This is an erratum for the article published in JISTEM, v. 9, n. 3, 2012: author William Samuel Ravatua Smith was added.

The full text is available at http://www.jistem.fea.usp.br/index.php/jistem/article/view/10.4301%252FS1807-17752012000300001/330

DOI: 10.4301/S1807-17752012000300001

  • FOX 11. (2008). News report on "Anonymous" [Video]. Retrieved from http://www.youtube.com/watch?v=kkAngvkWVkk&feature=youtube_gdata
  • Bar-Ilan, J. (2007). Manipulating search engine algorithms: the case of Google. Journal of Information, Communication and Ethics in Society, 2/3, 155-166.
  • Boutin, E. (2006). Biais cognitifs et recherche d'information sur internet. Quelles perspectives pour les indicateurs de pertinence des moteurs de recherche. Actes du colloque VSST. Lille, France.
  • Bulinge, F. (2002). Pour une culture de l'information dans les petites et moyennes organisations : un modèle incrémental d'intelligence économique. France: Université du Sud Toulon Var.
  • Campbell, M. (2003). Revenge of the Mininet. Retrieved from http://www.ojdl.com/Ebooks/seo-ebooks/revenge-of-the-mininet.pdf
  • Cutts, M. (2008). Preventing blight. Google.
  • Dou, H., B. E. (2003). De la création des bases de données au développement des systèmes d'intelligence pour l'entreprise. ISDM.
  • Himanen, P. (2001). L'éthique hacker et l'esprit de l'ère de l'information. Exils.
  • Humpich, S. (2001). Le Cerveau bleu. Xo Editions.
  • AT Internet Institute. (April 2009). Baromètre des moteurs. Retrieved from http://www.atinternet-institute.com/fr-fr/barometre-des-moteurs/barometre-des-moteurs-avril-2009/index-1-1-6-170.html
  • iProspect. (2006). Search engine user behavior study. Retrieved from http://www.iprospect.com/premiumPDFs/WhitePaper_2006_SearchEngineUserBehavior.pdf
  • Larose, D. (2005). Des données à la connaissance : Une introduction au data-mining. Vuibert.
  • Maniez, D. (2008). Les dix plaies d'Internet : Les dangers d'un outil fabuleux. Dunod.
  • Quoniam, L. B. (2008). Web 2.0, la révolution connectique. Document numérique, 11(1-2), 133-143.
  • Ramonet, I. (2001). La Tyrannie de la communication. Gallimard.
  • Rasse, P. (2005). La rencontre des mondes : Diversité culturelle et communication. Armand Colin.
  • De Rosnay, J. (2006). La révolte du pronétariat : des mass média aux média des masses. Fayard.
  • Spink, A., & Jansen, B. J. (2004). Web Search: Public Searching of the Web. New York: Springer-Verlag.
  • Svore, K. M., Wu, Q., Burges, C. J., & Raman, A. (2007). Improving web spam classification using rank-time features. In Proceedings of the 3rd International Workshop on Adversarial Information Retrieval on the Web (AIRWeb '07) (pp. 9-16). ACM.
  • TNS Sofres. (2007). Marques et Web 2.0 : mythes et réalités. Retrieved from http://www.tns-sofres.com/_assets/files/221007_web20.pdf
  • Toffler, A. (1980). La 3ème vague. Denoël.
  • Wired News. (2006). Now starring on the Web: YouTube. Retrieved from http://www.wired.com/techbiz/media/news/2006/04/70627
Notes

1. PageRank: the algorithm used by Google to determine the indexing of web pages.

2. SERP: Search Engine Result Page.

3. Digg-like: Web 2.0, again through its writability, has permitted the advent of such systems: platforms that propose ranking websites by the votes of their users. The articles or websites with the most success are placed on a page which, in the eyes of Google and other search engines, has good visibility and good credibility, since it has won the popular vote. This is the principle of platforms like Digg and Wikio, and of CMS programs like Pligg, PHPDug and Scuttle. It is enough for the author of a constellation to find an automated system that votes for its member sites; ultimately, these sites will be placed high on the list for the keywords chosen by the designer. It is a form of social bookmarking.

4. Captcha: a reverse Turing test in visual form: a piece of software tests whether the interactor is a human or an automaton by presenting an alphanumeric string of characters hidden in an image, supposedly untreatable by automated systems such as optical character recognition, and asking that its contents be copied into the appropriate reply box.

5. The Toffler wave: according to Toffler, it is possible to synthesize changes and innovations into meta-entities that durably impact humanity on a global scale, for example agriculture and industrialization, and thereby gain a comprehensive understanding of the mechanisms involved in such phenomena.

6. Link farming: the growing of links, as in mininets: digital constellations whose fruit is a potential for optimization and visibility, allowing the capture of an audience.

7. PR farming: the cultivation of PageRank, similar to link farming.

8. Link baiting: the practice of luring visitors into promoting a link. It requires creating content interesting or useful enough that the internaut finds it relevant to vote for, either by making a trackback from their own site to the content in question, or by sharing it with others through forums or social bookmarking sites.

* Carding: refers to the piracy of credit cards, bank accounts and phones, usually in the smartcard format (microchip cards).