Autocomplete is coming to Gmail: Chrome extension will analyse your inbox to help you type an email almost TWICE as fast



Autocomplete is to blame for millions of nonsensical and sometimes hilarious messages on mobile.

And now the technology – considered both a blessing and a curse – is set to be unleashed on Gmail messages sent from a desktop.

The developers behind the Complete extension for Chrome have announced that Gmail users will, within the coming days, have the power of predictive text at their fingertips.

Users with the Chrome Complete extension for Gmail could, within the coming days, have the power of predictive text at their fingertips. Google claims that the system will work much in the same way that Swiftkey does, learning how and what a user types in Gmail in order to make accurate predictions

While similar autocomplete systems are available to, for instance, fill in the email address of intended recipients, this technology will work on the text of the email itself.

Google claims the system will work much in the same way that Swiftkey does on mobile – learning what a user types in order to make sensible predictions.

The Complete extension, which was created by a group of Israeli developers within a few days, uses something known as natural language processing (NLP).

The Complete extension, which was hacked together by a group of Israeli developers within a few days, uses something known as natural language processing (NLP). When a word suggestion is made, the user can accept it by hitting ctrl and spacebar, enter, or tab

When the technology goes wrong: Autocomplete is to blame for millions of nonsensical and sometimes hilarious messages on mobile

NLP will first perform a study of the user's Gmail account during setup in order to build a database of their typing habits.

It will also look at the previous words in the sentence, and the context of the email, before putting forward a word suggestion.

When a suggestion is made, the user can accept it by hitting Ctrl and spacebar, Enter, or Tab.
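The approach described above – build a database of the user's typing habits, then rank candidate next words by the preceding word, suggesting one only when the match is strong – can be sketched as a simple bigram frequency model. This is a minimal illustration of the general technique, not Complete's actual algorithm, and all names here are hypothetical:

```python
from collections import defaultdict, Counter


class BigramPredictor:
    """Minimal next-word predictor: counts which word follows which
    in a corpus of past emails, then suggests the most frequent
    follower of the word just typed."""

    def __init__(self):
        # For each word, a Counter of the words seen immediately after it
        self.following = defaultdict(Counter)

    def train(self, emails):
        for text in emails:
            words = text.lower().split()
            for prev, nxt in zip(words, words[1:]):
                self.following[prev][nxt] += 1

    def suggest(self, prev_word, min_probability=0.5):
        counts = self.following[prev_word.lower()]
        total = sum(counts.values())
        if total == 0:
            return None  # never seen this word before
        word, n = counts.most_common(1)[0]
        # Only suggest when the model is confident enough,
        # mirroring the "high probability" behaviour described above
        return word if n / total >= min_probability else None


predictor = BigramPredictor()
predictor.train([
    "please find attached the report",
    "please find the invoice attached",
    "please let me know",
])
print(predictor.suggest("please"))  # → find (2 of 3 followers)
```

A real system would weigh more context than a single preceding word – the extension reportedly also considers the rest of the sentence and the email's subject – but the confidence threshold illustrates why predictions become bolder as the model sees more of the user's writing.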

Initially, Complete will work only within Gmail, but the company behind it, Tel Aviv-based Swayy, plans to launch it for Facebook, iOS and Android in the future.

Shlomi Babluki, one of the developers behind the project, told Wired.co.uk that as the software gets to know the user, it will become bolder with its predictions.

'If our algorithm identifies a match with high probability, it will suggest the following word before you have even started typing it,' he said.

According to Babluki, tests have shown that using the Complete software can reduce the number of keystrokes required to write a message in English by around 35 to 40 per cent.

HOW GOOGLE'S AUTOCOMPLETE REVEALS RACIST AND SEXIST SEARCHES 

Google already uses autocomplete in its search function. As you type, autocomplete predicts and displays queries to choose from.

The search queries that you see as part of autocomplete are a reflection of the search activity of all web users and the content of web pages indexed by Google.

And it can reveal a lot about user activity. For instance, a study last year claimed internet giant Google's search facility 'perpetuates prejudices'.

The investigation from Lancaster University found that Google's autocomplete internet search tool produces suggested terms which could be viewed as racist, sexist or homophobic.

The research revealed high proportions of negative evaluative questions for black people, gay people and males. 

For black people, these questions involved constructions of them as lazy, criminal, cheating and under-achieving.

Gay people were negatively constructed as contracting AIDS, going to hell, not deserving equal rights, having high voices or talking like girls.

The negative questions for males positioned them as catching thrush, under-achieving and treating females poorly. 

A study last year claimed internet giant Google's search facility 'perpetuates prejudices'. Pictured is what happens when you search 'why do men'
A study last year claimed internet giant Google's search facility 'perpetuates prejudices'. Pictured is what happens when you search 'why do men'
