iPhone’s AI could be more intelligent than a human being!
Yes, you heard that right. The search feature in the Photos app can pick out pictures of you wearing a bra.
Just a couple of days ago, Twitter users started pointing out that if you search the term “brassiere” on your iPhone, it automatically displays all your revealing pictures.
That is convenient and creepy at the same time, and the internet is unable to handle it.
The user @ellieewbu, who spotted the scandalous function, tweeted, “Attention all girls all girls!!! Go to your photos and type in ‘Braissiere’ why are apple saving these and made it a folder!!?!!?”
The public went crazy over it.
This feature has innocent roots that go back to June 2016, when Apple introduced object detection technology as part of iOS 10. The smart software was then made more accurate with the announcement of iOS 11.
However, the discovery raised privacy concerns, with affronted female users understandably worried that Apple could be deliberately collecting their bra selfies.
Supermodel Chrissy Teigen took the ongoing social media frenzy to another level by sharing a screenshot of her “brassiere” search, confirming that the feature is real.
“It’s true. If u type in ‘brassiere’ in search of your ‘iPhotos’ it has a category for every b**b or cleavage pic you’ve ever taken. Why ” the celebrity wrote.
Investigating further, Teigen said, “I am wondering why that is the word of choice and nothing else works.” It is true that the term is not commonly used in everyday conversation.
In another follow-up tweet, the brunette beauty wrote, “Typing food will get you food but p**is won’t get you p**is, and b**bs won’t get you b**bs. Just brassiere. *strokes beard*.”
One male user decided to weigh in on her technical assessment.
“You’re smart Chrissy,” @jacecraftmiller commented. “You’ve taken these pictures have you not? It recognizes the bras you have on. This is all done on the device.”
Hardly pleased with the belittling reply, Chrissy shot back impatiently, insisting, “No socks. No underwear. No pants. No shirts. Just bras.”
On the other hand, some users took to social media to have some fun with the newly discovered “brassiere” folder.
The results were hilarious.
Luckily, the algorithms Apple uses run natively on the device, which means your iPhone does not send your photos to cloud servers for analysis.
According to Apple’s official support page, “Photos recognizes scenes or specific objects in your photos so that you can search for things like dogs, mountains, or flowers.”
Interestingly, items your phone has not classified do not show up as search terms. If Photos hasn’t recognized any pictures of gloves, for instance, “gloves” won’t appear as a search suggestion.
Since most iPhone users now know about it, they are showing off results ranging from pet photos to pictures of delicious food.
Replying to the original post @annaivanovai wrote, “I have a folder with adult cats.”
Another user posted, “Forget the brasserie thing, they’ve categorized all my food pics under “chow,” and I’m living for it.”
The best-known intelligent photo search outside the iPhone is Google Photos, whose search feature launched a year before Apple’s.
Google Photos uses advanced machine learning and image recognition to pinpoint people, places, and objects.
While Apple’s technology can identify over 4,000 items in photos, Google’s system is smarter and offers more features.
Many people prefer it because it is free and stores your media at high quality, whereas Apple charges for premium storage.
But Google’s approach to photos can give you the creeps.
Every picture you take is stored on a cloud server, which means Google can scan through your photos to learn what’s going on in your personal life.
The concerns go beyond a simple matter of principle about personal privacy, though that alone will be reason enough for some people to shun the service.
Data can leak or be stolen. Companies can share information without our knowledge. Government agencies can spy. Google can revise the legal terms you’ve already agreed to at any time.
This isn’t to say any of this will happen, but you can’t rule it out!