Christian school teacher used AI and yearbook photos to make child pornography: Police


Steven Houser Beacon Christian Academy

Steven Houser, inset, was arrested after he allegedly used yearbook photos of students at Beacon Christian Academy in New Port Richey, Florida, where he was a teacher, and morphed them into sexually abusive images. (Houser: Pasco County Sheriff’s Office; Screenshot: Google Maps)

A teacher at a Christian school in Florida is behind bars after authorities caught him with child sexual abuse material generated by artificial intelligence from yearbook photos.

Pasco County Sheriff’s deputies arrested 67-year-old Steven Houser, a third-grade science teacher at Beacon Christian Academy in New Port Richey, on charges of possession of child pornography. Deputies reportedly received a tip that Houser possessed sexually abusive material and found that he had two photos and three videos. Those did not feature any students.

However, deputies said they later learned he was in possession of AI-generated photos that morphed three students’ yearbook photos onto nude bodies. Deputies do not believe there are any other victims, but anyone who believes they may be a victim of Houser is asked to contact the Pasco Sheriff’s Office Crime Tips Line at 1-800-706-2488 or to report tips online.

This is the second case of a teacher using yearbook photos to make child sexual abuse material that Law&Crime has reported on this week.

Federal prosecutors from the Western District of Kentucky last month indicted 39-year-old Jordan Fautz on charges of distribution of child pornography and obscene visual representation of child sexual abuse. Fautz was a seventh and eighth grade teacher at St. Stephen Martyr Catholic School in Louisville.

FBI agents caught Fautz distributing a nude photo collage on a social networking app that used the real names of a minor girl between 12 and 15 years old and her mother, authorities said. Fautz allegedly used Photoshop or morphing technology to place nonsexual yearbook photos onto pornographic images and videos. The girl wore a red shirt with the letters SSM, the school’s initials, while standing in front of an SSM Cardinals school photo backdrop, according to the complaint. Last week, parents and students sued the school and the Catholic Diocese of Louisville, claiming they ignored warning signs about Fautz.

As AI becomes more prevalent, lawmakers and authorities are trying to grapple with the issue. Last month, lawmakers in South Carolina filed two bills that specifically address these morphed images of kids.

“Creators and users are one step ahead of the law,” state Sen. Brad Hutto, a bill sponsor, told Greenville, South Carolina, Fox affiliate WHNS. “These bills give the Attorney General’s office, solicitors, and law enforcement the flexibility they need to investigate and prosecute obscene images of child sexual abuse now and in the future. We have to update our laws to remain effective in prosecuting those who place children in jeopardy.”

Meanwhile, other cases are sprouting up. A student at a high school in New Jersey caused an uproar last year when he used a website to morph photos of female classmates into pornographic images, the Wall Street Journal reported. The Washington Post reported in November that these fake images, particularly of teenage girls, are “booming.”

The issue is not only affecting young girls but also celebrities. Pornographic deepfakes of Taylor Swift went viral in late January. Marvel star Xochitl Gomez, 17, and New York Rep. Alexandria Ocasio-Cortez were also targets of AI-generated pornography.

The American Enterprise Institute says two federal statutes ban the use of morphed pornographic images; however, the U.S. Supreme Court has yet to uphold the laws.

University of Florida law professor Stacey Steinberg writes that “much must be done” to address the issue, starting with the U.S. Supreme Court upholding the laws barring the images to foreclose the argument that they could be protected as freedom of expression under the First Amendment.

“Second, to respond to the harms morphed child pornography causes, states must amend their statutory definitions of child pornography,” she writes. “Lastly, parents must be cognizant of the risk associated with oversharing pictures as these images can be stolen and then used for illicit purposes.”

Have a tip we should know? [email protected]
