Trending Spy News
Dropbox’s AI tools can help you find your stuff — from everywhere on the internet
Dropbox is adding two different but related AI-powered services to its platform. The first is simple and obvious: a tool for summarizing and querying documents. This is neat and useful and the sort of feature you’ll see in most tools in this category over time.
The other thing Dropbox is launching is much more ambitious and interesting. It’s a universal search engine that can access your files in Dropbox but also across the entire web. It’s called Dash and comes from Dropbox’s 2021 acquisition of a company called Command E. The idea behind Dash, Dropbox CEO Drew Houston tells me, is that your stuff isn’t all files and folders anymore, and so Dropbox can’t be, either. “What used to be 100 files or icons on your desktop,” he says, “is now 100 tabs in your browser, with your Google Docs and your Airtables and Figmas and everything else.” All the tools are better, but they resist useful organization. “So you’re just like, okay, I think someone sent that to me. Was it in an email? Was it Slack? Was it a text? Maybe it was pasted in the Zoom chat during the meeting.” Dash aims to be the “Google for your personal stuff” app that so many others have tried and failed to pull off.
The Dash app comes in two parts. There’s a desktop app, which you can invoke from anywhere with the CMD-E keyboard shortcut, that acts as a universal search for everything on your device and in all your connected apps. (If you’ve ever used an app like Raycast or Alfred as a launcher, Dash will look very familiar.) There’s also a browser extension, which offers the same search but also turns your new tab page into a curated list of your stuff. One section of the Dash start page might include the docs Dropbox thinks you’ll need for the meeting starting in five minutes; another might pull together a bunch of similar documents you’ve been working on recently into what Dropbox calls a “Stack.” You can also create your own stacks, and as you create files and even browse the internet, Dash will suggest files and links you might add.
The term “stacks” is important, by the way. Dropbox has been a files-in-folders company since it was founded in 2007 and is making a conscious break with that paradigm as it leans into all things AI. “There’s no real container that can hold a Google Doc and an Excel spreadsheet and a 10-gig 4K video,” Houston says, and the old organizational systems break down even further as the platform begins to learn that all three of those things are about your house renovation project, and hey, here are some other documents about that project too!
Could you just call all that… a folder? Sure! But the way Dropbox sees it, the concept of folders has so much history that it’s getting in the way. “Folks are looking for an increased kind of flexibility,” says Devin Mancuso, Dropbox’s director of product design, “or when it comes to tabs and apps, they’re thinking about grouping and arranging those in slightly different ways.” You can have a file in multiple stacks, just to name one example, which doesn’t work in a folders world. Houston and Mancuso both compare stacks instead to Spotify playlists in that they’re a mix of personally curated and algorithmically enhanced. Losing the f-word is both a practical design decision and a philosophical one.
When Houston gave me a demo of Dash working on his own account, his new-tab page pulled up both a bunch of information about me and The Verge (presumably tied to the calendar event that included us both) and built an automated stack of documents related to the planning offsite he and his executives were in the midst of that week. “It’s such a basic concept, right?” he says, mousing around in his browser. “Search that actually works, a collection concept for links and files and any kind of cloud content, bringing machine intelligence into the experience — it’s more of a self-organizing Dropbox. Not everyone has to be their own librarian, filing things away.”
This is, of course, not a new or unique idea. The idea of cross-platform, universal search for your personal data and documents has been around practically as long as the internet. Large language models can definitely make that search more powerful, which is why companies like Mem and Rewind and even Google have been investing in it in big ways.
Houston readily acknowledges that Dropbox isn’t the first company to have this idea, but he thinks Dropbox has one big advantage over most of its competitors in this space: it already has plenty of users and companies uploading all their most important and most sensitive stuff to the platform. Integrating with the Figmas and Airtables of the world is a much easier problem, in some ways, than getting access to your existing file system. “It’s a very natural extension,” Houston says, “to be like, ‘We started with your files, but now we support everything else. Maybe we should have been supporting everything else for a long time.’”
The big question, for Dropbox and everyone else working on this, is security. Here, too, Houston thinks Dropbox has a leg up. “Nobody wants their stuff to be chopped up into little pieces and fed into some kind of advertising machine,” he says. “So the fact that Dropbox is a fundamentally private service, the fact that we’re subscription, the fact that our incentives are aligned, it all helps.” Especially with all your data in the cloud, there are still plenty of questions about how data is accessed, who can see what, how personalized various systems should be, and much more.
As of today, Dropbox AI is available to all Pro customers and a few teams, and there’s a waitlist to get into the Dash beta as well. The next phase for Dropbox, Houston says, is to learn what people want and how they use the products. He says he’s happy to be somewhat conservative at first in the name of not making huge mistakes — you really can’t have an AI hallucinating information out of your most sensitive work docs — but he sees this stuff getting better fast.
In general, Dropbox has been thinking about AI integrations for a long time. It’s one of a class of what you might call work-about-work companies, along with Asana, Slack, and others; they’re not the tools you use to get stuff done — they’re the tools for keeping your files in order and your team in sync and your life together. For all these companies, step one was making it easier to manage everything. But that always implied a step two: teach the things to manage themselves. “In the physical world,” Houston says, “the equivalent is to just imagine you have all these papers on your desk, and they’re neatly sorting themselves into piles. That’s great. That’s what we’re building.”
Trending Spy News
FTC investigating OpenAI on ChatGPT data collection and publication of false information
The Federal Trade Commission (FTC) is investigating ChatGPT creator OpenAI over possible consumer harm through its data collection and the publication of false information.
As first reported by The Washington Post, the FTC sent a 20-page letter to the company this week. The letter requests documents related to how OpenAI develops and trains its large language models, as well as its data security practices.
The FTC wants detailed information on how OpenAI vets the information used to train its models and how it prevents false claims from being shown to ChatGPT users. It also wants to learn more about how APIs connect to its systems and how data is protected when accessed by third parties.
The FTC declined to comment. OpenAI did not immediately respond to requests for comment.
This is the first major US investigation into OpenAI, which burst into the public consciousness over the past year with the release of ChatGPT. The popularity of ChatGPT and the large language models that power it kicked off an AI arms race, prompting competitors like Google and Meta to release their own models.
The FTC has signaled increased regulatory oversight of AI before. In 2021, the agency warned companies against using biased algorithms. In March, industry watchdog the Center for AI and Digital Policy also called on the FTC to stop OpenAI from launching new GPT models.
Large language models can put out factually inaccurate information. OpenAI warns ChatGPT users that it can occasionally generate incorrect facts, and Google’s chatbot Bard’s first public demo did not inspire confidence in its accuracy. And based on personal experience, both have spit out incredibly flattering, though completely invented, facts about me. Others have had more serious run-ins with ChatGPT’s output: a lawyer was sanctioned for submitting fake cases created by ChatGPT, and a Georgia radio host sued OpenAI over results that claimed he was accused of embezzlement.
US lawmakers have shown great interest in AI, both in understanding the technology and in possibly enacting regulations around it. The Biden administration released a plan to provide a responsible framework for AI development, including a $140 million investment to launch research centers. Supreme Court Justice Neil Gorsuch also discussed chatbots’ potential legal liability earlier this year.
It is in this environment that AI leaders like OpenAI CEO Sam Altman have made the rounds in Washington. Altman lobbied Congress to create regulations around AI.
Trending Spy News
OpenAI will use Associated Press news stories to train its models
OpenAI will train its AI models on The Associated Press’ news stories for the next two years, thanks to an agreement first reported by Axios. The deal between the two companies will give OpenAI access to some of the content in AP’s archive as far back as 1985.
As part of the agreement, AP will gain access to OpenAI’s “technology and product expertise,” although it’s not clear exactly what that entails. AP has long been exploring AI features and began generating reports about company earnings in 2014. It later leveraged the technology to automate stories about Minor League Baseball and college sports.
AP joins OpenAI’s growing list of partners. On Tuesday, the AI company announced a six-year deal with Shutterstock that will let OpenAI license images, videos, music, and metadata to train its text-to-image model, DALL-E. BuzzFeed also says it will use AI tools provided by OpenAI to “enhance” and “personalize” its content. OpenAI is also working with Microsoft on a number of AI-powered products as part of Microsoft’s partnership and “multibillion dollar investment” in the company.
Announcing partnership with @AP — we’ll help them thoughtfully explore use-cases for our technology, we’ll work with their content in our systems: https://t.co/3lAqzfCF5P
— Greg Brockman (@gdb) July 13, 2023
“The AP continues to be an industry leader in the use of AI; their feedback — along with access to their high-quality, factual text archive — will help to improve the capabilities and usefulness of OpenAI’s systems,” Brad Lightcap, OpenAI’s chief operating officer, says in a statement.
Earlier this year, AP announced AI-powered projects that will publish Spanish-language news alerts and document public safety incidents in a Minnesota newspaper. The outlet also launched an AI search tool that’s supposed to make it easier for news partners to find photos and videos in its library based on “descriptive language.”
AP’s partnership with OpenAI seems like a natural next step, but there are still a lot of crucial details missing about how the outlet will use the technology. For its part, AP makes clear that it “does not use it in its news stories.”
Trending Spy News
Congress is trying to stop discriminatory algorithms again
US policymakers hope to require online platforms to disclose information about their algorithms and to allow the government to intervene if those algorithms are found to discriminate based on criteria like race or gender.
Sen. Edward Markey (D-MA) and Rep. Doris Matsui (D-CA) reintroduced the Algorithmic Justice and Online Platform Transparency Act, which aims to ban the use of discriminatory or “harmful” automated decision-making. The bill would also establish safety standards, require platforms to provide plain-language explanations of the algorithms they use and to publish annual reports on their content moderation practices, and create a governmental task force to investigate discriminatory algorithmic processes.
The bill applies to “online platforms,” meaning any commercial, public-facing website or app that “provides a community forum for user-generated content.” That can include social media sites, content aggregation services, and media and file-sharing sites.
Markey and Matsui introduced a previous version of the bill in 2021. It moved to the Subcommittee on Consumer Protection and Commerce but died in committee.
Data-based decision-making, including social media recommendation algorithms and machine learning systems, often lives in a proverbial black box. This opacity sometimes exists because of intellectual property concerns or a system’s complexity.
But lawmakers and regulators worry it could obscure biased decision-making with a huge impact on people’s lives, and the problem extends well beyond the online platforms the bill covers. Insurance companies, including those working with Medicaid patients, already use algorithms to grant or deny patient coverage. Agencies such as the FTC signaled in 2021 that they may pursue legal action against biased algorithms.
Calls to make algorithms more transparent have grown over the years. After several scandals in 2018 — which included the Cambridge Analytica debacle — AI research group AI Now found that governments and companies had no real way to punish organizations that produce discriminatory systems. In a rare move, Facebook and Instagram announced the formation of a group to study potential racial bias in their algorithms.
“Congress must hold Big Tech accountable for its black-box algorithms that perpetuate discrimination, inequality, and racism in our society – all to make a quick buck,” Markey said in a statement.
Most proposed regulations around AI and algorithms include a push to create more transparency. The European Union’s proposed AI Act, now in its final stages of negotiation, also emphasizes the importance of transparency and accountability.