Author: mechaneyes

  • Vladimir Ivkovic All Night Long

    Vladimir Ivkovic All Night Long

    Limited Edition · Portland

    I was tipped off to Vladimir Ivkovic by Apiento of the excellent Test Pressing, who said Vladimir's set was at the top of his list from Love International last summer.

    Vladimir is playing the closing Sunday eve set at Nowadays this weekend, so having him on heavy rotation these last few weeks may prove to have been a wise conditioning move.

    November 10 2023
    Limited Edition in Portland.
    Home on the fringe of night life, family, goodness.
    Gathering No. 70.
    If you know you know.

    https://soundcloud.com/vladimir/vladimirivkovicle70

  • AI-assisted Targeting

    AI-assisted targeting in the Gaza Strip

    Wikipedia

    As part of the Gaza war, the Israel Defense Forces (IDF) have used artificial intelligence to rapidly and automatically perform much of the process of determining what to bomb. Israel has greatly expanded the bombing of the Gaza Strip, which in previous wars had been limited by the Israeli Air Force running out of targets.

    These tools include the Gospel, an AI which automatically reviews surveillance data looking for buildings, equipment and people thought to belong to the enemy, and upon finding them, recommends bombing targets to a human analyst who may then decide whether to pass it along to the field. Another is Lavender, an “AI-powered database” which lists tens of thousands of Palestinian men linked by AI to Hamas or Palestinian Islamic Jihad, and which is also used for target recommendation.

    ‘The Gospel’: how Israel uses AI to select bombing targets in Gaza

    The Guardian · Archived

    Precisely what forms of data are ingested into the Gospel is not known. But experts said AI-based decision support systems for targeting would typically analyse large sets of information from a range of sources, such as drone footage, intercepted communications, surveillance data and information drawn from monitoring the movements and behaviour patterns of individuals and large groups.

    · · ·

    “We work quickly and there is no time to delve deep into the target. The view is that we are judged according to how many targets we manage to generate.”

    Israel is using an AI system to find targets in Gaza. Experts say it’s just the start

    NPR · Archived

    Although it’s not known exactly what data the Gospel uses to make its suggestions, it likely comes from a wide variety of different sources. The list includes things like cell phone messages, satellite imagery, drone footage and even seismic sensors, according to Blaise Misztal, vice president for policy at the Jewish Institute for National Security of America, a group that facilitates military cooperation between Israel and the United States.

    · · ·

    “AI algorithms are notoriously flawed with high error rates observed across applications that require precision, accuracy, and safety,” warns Heidy Khlaaf, Engineering Director of AI Assurance at Trail of Bits, a technology security firm.

    · · ·

    “The nature of AI systems is to provide outcomes based on statistical and probabilistic inferences and correlations from historical data, and not any type of reasoning, factual evidence, or ‘causation,’” she says. “Given the track record of high error-rates of AI systems, imprecisely and biasedly automating targets is really not far from indiscriminate targeting.”

    · · ·

    While Israel’s use of the Gospel to generate a full set of targets may be unique, the nation is hardly alone in using AI to assist in intelligence analysis. The U.S. is actively working with many different kinds of AI to try and identify targets in the field. One suite of AI tools, known as Project Maven, is run through the National Geospatial-Intelligence Agency, which collects massive quantities of satellite imagery.

    · · ·

    Ashley wouldn’t comment on any particular AI tool used by the U.S. intelligence community, but he says often these systems will stitch together multiple layers of AI. Some excel at finding objects in images while others can sort through things like radio transmissions . . . “You know the Russians are doing it, you know the Chinese are doing it,” he says.

    Screenshot of the Wall Street Journal.
    Headline: "Anthropic Dials Back AI Safety Commitments"
    Subheadline: "Company says competitive pressure prompts it to pivot away from a more-cautious stance"
    Photo of Jared Kaplan, chief science officer of Anthropic, holding his hands open and looking downward.

    Anthropic Dials Back AI Safety Commitments

    WSJ Gift Link

    New Scientist screenshot.
    Headline: AIs can't stop recommending nuclear strikes in war game simulations

    AIs can’t stop recommending nuclear strikes in war game simulations

    New Scientist

  • RP2040 Meshtastic Nibble

    Retia.io

    Waves that purr

    Thumbnail of a thumb holding the RP2040 Meshtastic Nibble. It's a small green PCB with circuitry on the bottom half and a cat face silkscreened on top. At the very top are pokey, triangular ears, an LED and a short, copper coil antenna.

  • Hackers Expose Age-Verification Software Powering Surveillance Web

    L0la L33tz · The Rage

    Three hacktivists tried to find a workaround to Discord’s age-verification software. Instead, they found its frontend exposed to the open internet.

    In 2,456 publicly accessible files, the code revealed the extensive surveillance Persona software performs on its users, bundled in an interface that pairs facial recognition with financial reporting – and a parallel implementation that appears designed to serve federal agencies.

    Persona Identity, Inc. is a Peter Thiel-backed venture.

    The software performs 269 distinct verification checks and scours the internet and government sources for potential matches, such as by matching your face to politically exposed persons (PEPs), and generating risk and similarity scores for each individual. IP addresses, browser fingerprints, device fingerprints, government ID numbers, phone numbers, names, faces, and even selfie backgrounds are analyzed and retained for up to three years.

    The program, according to the researchers, performs product analytics and user behavior tracking on a government identity-verification platform, provides real-time user monitoring — every click, every page load — on a FedRAMP platform processing PII and biometrics, and includes financial identity-verification capabilities on the government platform.

  • NYPD: Internet Attribution Management Infrastructure

    NYPD · NYC.gov

    The NYPD disclosure from February 4th:

    The NYPD uses internet attribution management infrastructure, including Ntrepid, to manage digital footprints and allow its personnel to safely, securely, and covertly conduct investigations and detect possible criminal activity on the internet.

    . . .

    The information that is ultimately accessible to NYPD personnel utilizing this equipment is limited to publicly available information or the information that is viewable as a result of the privacy settings, privacy practices, and access limitations of an internet environment (e.g., chatrooms, social media profiles, messaging applications)

  • Mamdani faces first showdown with NYPD – will he risk alienating police?

    Eric Berger · The Guardian

    On 4 February, the NYPD disclosed that it used “internet attribution management infrastructure” from the technology company Ntrepid to “allow its personnel to safely, securely and covertly conduct investigations and detect possible criminal activity on the internet”. In other words, to create the sort of “sock puppet” online identities that Mamdani had once sought to prevent.

    . . .

    Owen, of Stop, also argues that the police could use such a tool to target Black and Latino residents. He pointed to the NYPD’s previous disclosure that if someone “makes a comment such as ‘Happy Birthday’ on the Facebook page of a gang member”, they could be considered a “known associate” and added to its criminal database, according to an inspector general report.

  • EFFecting Change: Get the Flock Out of Our City

    EFF · EFFecting Change Livestream Series

    Join our panel to explore what’s happening as Flock contracts face growing resistance across the U.S. We’ll break down the legal implications of the data these systems collect, examine campaigns that have successfully stopped Flock deployments, and discuss the real-world consequences for people’s privacy and freedom.

    Livestream
    February 19, 2026 – 12:00pm to 1:00pm PST