Thursday, June 20, 2024

Luca Debus and his GoCreator pin





This is Luca Debus of "Wannabe" with his GoCreator pin of immeasurable value and worth!

"I finally got the pin! It arrived this morning. I had to pay 10 bucks for silly German customs.... but it's totally WORTH IT!"

We think it's worth it, too, Luca. We are so happy that you are a member of our little GoCreator group. 


#FB00860

Monday, June 17, 2024

Toe profiles...

Which of the above resembles your foot, dearest FA reader? Let's do an unscientific survey. 

 #FB00859

My Mill Grinds...


#FB00858

Wednesday, June 12, 2024

Donuts glossary




 #FB00857

Deep Fakes using Artificial Intelligence

 

AI-generated audio and video are being used to trick businesses and individuals


The person at the other end of that video call certainly looks and sounds legitimate. Maybe it’s someone you’ve bonded with on a dating site, or maybe it’s a semi-distant relative or remote work colleague. Yes, it’s odd that they’re asking you to send them money or provide sensitive personal information, but you trust them.

Just one problem: They’re not real. Their image and voice have been generated through artificial intelligence (AI), and are being controlled behind the scenes by a scammer. What you’re experiencing is a deepfake, a rapidly evolving technology often used for malicious acts.

The U.S. Government Accountability Office (GAO) defines a deepfake as video, photography, or audio that "seems real but has been manipulated with AI. The underlying technology can replace faces, manipulate facial expressions, synthesize faces, and synthesize speech."

More and more criminals are using AI deepfakes to commit identity fraud or pry money and data from businesses and individuals. The digital verification platform Sumsub reported an astonishing 1740% jump in deepfake-related fraud attempts in North America between 2022 and 2023.

How deepfakes are being used in scams

By creating a deepfake persona, fraudsters can trick people into believing they’re interacting with someone they know or want to know. This builds trust, making it easier for the scammer to manipulate the victim. Cybercriminals can also use deepfakes to create compromising material for extortion. They can, for example, use an AI bot to take a brief snippet of a person’s real voice and “clone” it, producing an authentic-sounding facsimile; the faked voice can then be made to say just about anything.

The majority of deepfake fraud cases thus far have targeted businesses. Even large global companies can fall for these scams: In one recent example, an employee at a multinational design and engineering firm was tricked by a deepfake video call into transferring $25 million of the company’s funds to fraudsters. Many bad actors, meanwhile, are using deepfake audio and video in attempts to gain access to company data, which could result in breaches of customer information.

As this technology grows more sophisticated, it’s also getting easier to use—which means it’s becoming increasingly popular as a method to defraud individuals. Deepfakes have already made their way into the world of romance scams, according to a recent report in Wired. The article described how a crew of scammers used "deepfakes and face-swapping to ensnare victims in romance scams, building trust with victims using fake identities, before tricking them into parting with thousands of dollars."

Read more here.

Don't get scammed!

#FB00856

Monday, June 10, 2024

Take the fork ahead...


 I wanted to say that there was a fork in the road, but the fork isn't really in the road.

#FB00855

Monday, June 3, 2024

John Wayne (1930)

Age 23.

#FB00854

Saturday, June 1, 2024

Merkins for sale



Pubic wigs (called merkins) were worn by prostitutes as far back as the 1450s.

The reason for this strange accessory was that pubic hair was considered fashionable and attractive, but sex workers shaved their lower parts to avoid pubic lice (annoying) and wore a merkin both to cover the shave and to hide the signs of sexually transmitted diseases, such as syphilis, from their clients. 

In Hollywood film production, merkins may be worn by actors and actresses to avoid exposing genitalia during nude or semi-nude scenes.

The presence of the merkin protects the actor from inadvertently performing "full frontal" nudity – some contracts specifically require nipples and genitals to be covered in some way – which may help ensure the film achieves a less restrictive MPAA rating.

The Oxford English Dictionary dates the first written use of the term to 1617.

The word probably originated from malkin, a derogatory term for a lower-class young woman, or from marykin, a pet form of the female name Mary.

Thanks to 6turtle9 for this post.

#FB00853

Is anyone else wondering if that guy is wearing one of his own merkins on his face?
