Online advertising space
Display and digital advertising algorithms
In 2016, for the first time, investment in online advertising in France surpassed investment in TV advertising. Algorithms now play an ever more important part in the buying of advertising space on websites, raising many ethical and legal questions. In France, the digital advertising sector is now worth 3.5 billion euros.
While online advertising used to consist mainly of display ads on websites and the purchase of Google AdWords, the automated buying of advertising space (known as "programmatic buying") has since emerged. Internet user profiles are built from browsing traces so that their interest in an ad can be predicted at any given moment.
Thanks to algorithms, it is therefore possible to compute, in real time, the value of the advertising space on the page a visitor is currently viewing. Using algorithms has the benefit of showing advertisements that match our interests, but there is a risk of unchecked use. When these mechanisms are not transparent, they influence the behaviour of Internet users without their noticing.
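The real-time valuation described above can be illustrated with a minimal sketch. This is a simplified, hypothetical model (all names and numbers are invented, not a real ad-exchange API): each advertiser values an impression as the predicted click probability for this visitor's profile times its value per click, and the exchange runs a second-price auction, a mechanism commonly used in programmatic buying.

```python
# Hypothetical sketch of real-time bidding (RTB) valuation.
# Assumption: impression value = predicted click probability x value per click.

def impression_value(click_probability: float, value_per_click: float) -> float:
    """Expected value of showing the ad to this visitor, in euros."""
    return click_probability * value_per_click

def second_price_auction(bids: dict[str, float]) -> tuple[str, float]:
    """The highest bidder wins but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Each (fictitious) advertiser bids its expected value for this profile.
bids = {
    "advertiser_a": impression_value(0.020, 1.50),  # 0.030
    "advertiser_b": impression_value(0.010, 2.00),  # 0.020
    "advertiser_c": impression_value(0.005, 5.00),  # 0.025
}
winner, price = second_price_auction(bids)
print(winner, round(price, 3))  # advertiser_a wins, pays 0.025
```

The second-price design means each bidder's best strategy is simply to bid its true expected value, which is why the predicted click probability for a given user profile matters so much.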
In addition, algorithms sometimes enjoy excessive trust, yet their results can be discriminatory. Questions of fairness and ethics therefore arise. Studying ethics in this field must be rooted in an understanding of how we relate to these new techniques: on the one hand, how algorithms are legally protected and, on the other, how the digital advertising ecosystem is evolving.
Faced with these new demands, it would be advisable to focus on the algorithm itself rather than on the data it handles, by setting up testable and controllable mechanisms to prevent harmful outcomes. A new revolution is under way around the collection and processing of data, which has taken on unprecedented proportions and stimulated the development of new goods and technologies.
The growth in the volume and variety of data can be attributed to the proliferation of connected devices and to growing consumer uptake. Technological developments have enhanced companies' capacity to act: they are becoming increasingly dependent not only on the data produced but also on users' opinions, and must therefore continuously ensure that they maintain a good online reputation.
Against this background, the EU authorities have launched a legislative reform of personal data protection. In May 2018, the new General Data Protection Regulation (GDPR) will come into force in all member states. The Regulation requires greater transparency and greater accountability from those who process data, on the basis of an accountability regime, and provides for strict sanctions.
It also establishes a right to data portability, and data controllers must guarantee that their processing protects personal data from the design stage onwards (privacy by design). The GDPR also seeks to regulate the algorithmic processing of the data involved.
There is a general tendency in the advertising industry: websites and providers of services or products that rely on algorithms generally take care not to mention them. Rather than disclosing the central role algorithms play, they emphasise personalisation. "Classical" advertising legislation is founded on the basic rule of obtaining an individual's prior consent before processing their personal data.
This approach to privacy, however, is less pertinent when it comes to advertising. The data gathered through traditional market research mostly consists of factual and relatively stable information such as name, date of birth, gender, address or marital status; moreover, this information is not always accessible to the individual concerned. The notion of "data", however, changes dramatically in what we call e-commerce.
On social networks, the data collected is not limited to basic categories (age, gender, address) but extends to everyday behaviour: what I do, what I listen to, and so on. This new context calls into question the relevance of the distinction between personally identifiable and non-personally identifiable information. Consent is required in order to use these services, yet the precise way the data controller uses the information remains entirely opaque to the individual.
The issue therefore no longer lies in ex ante consent but in the automated, predictive inferences drawn by the companies collecting this data. Algorithms reinforce this tendency by multiplying the collection and use of trivial, decontextualised data that can be used to profile individuals, generating "knowledge" about them on the basis of probability rather than certainty about their actual individual and private preferences.
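The probabilistic profiling described above can be sketched in a few lines. This is purely illustrative (the event names, categories and mapping are invented, not any real ad-tech system): trivial, decontextualised events are aggregated into per-category interest probabilities about a person.

```python
# Illustrative sketch: turning a raw log of trivial behavioural events
# into probabilistic "knowledge" about a person's interests.
# The EVENT_CATEGORY mapping is a made-up assumption for this example.
from collections import Counter

EVENT_CATEGORY = {
    "visited:running-shoes-page": "sports",
    "watched:marathon-video": "sports",
    "searched:mortgage-rates": "finance",
    "read:recipe-article": "cooking",
}

def interest_profile(events: list[str]) -> dict[str, float]:
    """Aggregate categorised events into per-category interest probabilities."""
    counts = Counter(EVENT_CATEGORY[e] for e in events if e in EVENT_CATEGORY)
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

log = ["visited:running-shoes-page", "watched:marathon-video",
       "searched:mortgage-rates"]
profile = interest_profile(log)
print(profile)  # sports scores 2/3, finance 1/3
```

Note that the resulting profile is a probabilistic guess, not a fact about the person, which is exactly the shift from declared data to inferred "knowledge" that the text describes.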
Wouldn't it be more important, in this context, to scrutinise the algorithms that process data and generate new information, rather than the data they are fed? Demands for accountability, transparency and auditability of algorithm-driven decisions have become critical to prevent abuses. Two main principles should underpin an ethical approach to algorithmic processing: the need for transparency, and the need to set up tests to verify each algorithm's results in order to avoid harm.
Online platforms' activity centres mainly on selecting and ranking information as well as offers of goods and services. They design and deploy various algorithms that influence consumer behaviour and people's ways of thinking. This personalisation is sometimes deceptive, because it rests on the machine's representation of who we are.
That representation reflects not who we are, but what we have done and what we have looked at. These observations illustrate the need for transparency: the people affected by an algorithmic process should first be made aware that such a process exists, along with its meaning, the nature of the data it uses and its purposes, so that they can exercise a right of recourse if necessary.
Testing algorithms? Advertising algorithms can lead to differentiated pricing of a good or service, or even to profiling "high-risk" policyholders in order to set premiums according to sometimes illegitimate criteria by cross-referencing "sensitive" data. Not only is the collection and processing of such data (racial or ethnic origin, political opinions, religion) generally prohibited, but the results of these methods can also be discriminatory.
The first international beauty contest judged entirely by algorithms, for example, selected almost exclusively white winners. To prevent this kind of abuse, measures must urgently be taken to test the results of algorithms. Besides the legislative and protective role of the CNIL, codes of conduct are also emerging: advertisers in the Digital Advertising Alliance (DAA) have introduced a programme, signalled by a visual icon displayed next to a targeted ad, that tells users how the targeting works.
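One concrete way an auditor could test an algorithm's results, as called for above, is the "80% rule" disparate-impact check: compare the rate of favourable outcomes between groups and flag ratios below 0.8. The sketch below uses synthetic data and invented group names; it is one possible audit method, not a procedure prescribed by the CNIL or the DAA.

```python
# Hedged sketch of a disparate-impact audit on an algorithm's *outputs*.
# The 80% rule: if the least-favoured group's rate of favourable outcomes
# is below 80% of the most-favoured group's rate, flag possible bias.

def disparate_impact(outcomes: dict[str, tuple[int, int]]) -> float:
    """outcomes maps group -> (favourable decisions, total decisions).
    Returns min rate / max rate; values below 0.8 suggest possible bias."""
    rates = {group: fav / total for group, (fav, total) in outcomes.items()}
    return min(rates.values()) / max(rates.values())

# Synthetic audit data: how often each group was offered the low premium.
audit = {
    "group_a": (80, 100),  # 80% favourable
    "group_b": (50, 100),  # 50% favourable
}
ratio = disparate_impact(audit)
print(round(ratio, 3), "possible bias" if ratio < 0.8 else "ok")
# 0.625 -> flagged as possible bias
```

The strength of this kind of test is that it examines outcomes rather than the (often secret) inner workings of the algorithm, which matches the article's call to focus on results rather than on the data fed in.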
Internet users are tired of advertising they perceive as too intrusive. Advertising whose overall aim is to better anticipate our needs, for "better consumption", must operate within a compliant, accountable and ethical framework. It could then contribute to a new industrial revolution, one that respects fundamental human rights and civil liberties and calls on everyone to play their part and assume their responsibilities.