The blog of DataDiggers

Former Dropbox exec Dennis Woodside joins Impossible Foods as its first President

Posted on Mar 14, 2019 | 0 comments

Former Google and Dropbox executive Dennis Woodside has joined the meat replacement developer Impossible Foods as the company’s first President.

Woodside, who previously shepherded Dropbox through its initial public offering, is a longtime technology executive who is making his first foray into the food business.

The 25-year tech industry veteran most recently served as the chief operating officer of Dropbox, and previously was the chief executive of Motorola Mobility after that company’s acquisition by Google.

“I love what Impossible Foods is doing: using science and technology to deliver delicious and nutritious foods that people love, in an environmentally sustainable way,” Woodside said. “I’m equally thrilled to focus on providing the award-winning Impossible Burger and future products to millions of consumers, restaurants and retailers.”

According to a statement, Woodside will be responsible for the company’s operations, manufacturing, supply chain, sales, marketing, human resources and other functions.

The company currently has a staff of 350, divided between its Redwood City, Calif. headquarters and its Oakland manufacturing plant.

Impossible Foods now slings its burger in restaurants across the United States, Hong Kong, Macau and Singapore, and expects to launch a grocery store product later this year.


Source: TechCrunch

GM Cruise snags Dropbox HR head to hire at least 1,000 engineers by end of year

Posted on Mar 11, 2019 | 0 comments

GM Cruise plans to hire hundreds of employees over the next nine months, doubling its engineering staff, TechCrunch has learned. It’s an aggressive move by the autonomous vehicle technology company as it pushes to deploy a robotaxi service by the end of the year. Arden Hoffman, who helped scale Dropbox, will leave the file-sharing and storage company to head up human resources at Cruise.

The GM subsidiary, which has more than 1,000 employees, is expanding its office space in San Francisco to accommodate the growth. GM Cruise will keep its headquarters at 1201 Bryant Street in San Francisco. The company will also take over Dropbox’s headquarters at 333 Brannan Street sometime this year, a move that will triple Cruise’s office space in San Francisco.

“Arden has made a huge impact on Dropbox over the last four years. She helped build and scale our team and culture to the over 2,300-person company we are today, and we’ll miss her leadership, determination, and sense of humor. While we’re sorry to see her go, we’re excited for her and wish her all the best in this new opportunity to grow the team at Cruise,” a Dropbox spokesperson said in an emailed statement.

Prior to joining Dropbox, Hoffman was human resources director at Google for three years.

The planned expansion and hiring of Hoffman follows a recent executive reshuffling. GM president Dan Ammann left the automaker in December and became CEO of Cruise. Ammann had been president of GM since 2014, and he was a central figure in the automaker’s 2016 acquisition of Cruise and its integration with GM.

Kyle Vogt, a Cruise co-founder who was CEO and also unofficially filled the chief technology officer role, is now president and CTO.

Cruise has grown from a small startup with 40 employees to more than 1,000 today at its San Francisco headquarters. It has expanded to Seattle as well, in pursuit of talent. Cruise announced plans in November to open an office in Seattle and staff it with up to 200 engineers. And with the recent investments by SoftBank and Honda, which have pushed Cruise’s valuation to $14.6 billion, it has the runway to double its staff.

The hunt for qualified people with backgrounds in software engineering, robotics and AI has heated up as companies race to develop and deploy autonomous vehicles. There are more than 60 companies that have permits from the California Department of Motor Vehicles to test autonomous vehicles in the state.

Competition over talent has led to generous, even outrageous, compensation packages and poaching of people with specific skills.

Cruise’s announcement puts more pressure on that ever-tightening pool of talent. Cruise has something that many other autonomous vehicle technology companies don’t — ready amounts of capital. In May, Cruise received a $2.25 billion investment from SoftBank’s Vision Fund. Honda also committed $2.75 billion as part of an exclusive agreement with GM and Cruise to develop and produce a new kind of autonomous vehicle.

As part of that agreement, Honda will invest $2 billion into the effort over the next 12 years. Honda also is making an immediate and direct equity investment of $750 million into Cruise.

Cruise will likely pursue a dual path of traditional recruitment and acquisitions to hit that 1,000-engineer mark. It’s a strategy Cruise is already pursuing. Last year, Cruise acquired Zippy.ai, which develops robots for last-mile grocery and package delivery, for an undisclosed amount of money. The deal was more of an acqui-hire and did not include any of Zippy’s product or intellectual property. Instead, it seems Cruise was more interested in the skill sets of the co-founders, Gabe Sibley, Alex Flint and Chris Broaddus, and their team.

In 2017, Cruise also acquired Strobe, a LiDAR sensor maker. At the time, Cruise said Strobe would help it cut the per-vehicle cost of LiDAR by nearly 100 percent.


Source: TechCrunch

How to recover quickly if you get locked out of Google

Posted on Feb 1, 2019 | 0 comments

I know first-hand how frustrating it is to get locked out of your Google account and lose access to much of your online life. I’m hoping this simple workaround will help you get through the account recovery process much faster than the manual method, which takes a minimum of three to five days (and in my case ended up taking weeks).

This week, a colleague who remembered my article on my lock-out experience asked me for advice after she was locked out of her account. And a solution occurred to me, one that I had actually discovered last year, but had never put to use myself. It worked for her, and I hope it works for you too. It’s actually pretty simple.

If you have paid storage on Google, follow these steps:

  1. Go to Google One.
  2. Click the Call button at the top of the screen.
  3. Tell the person who answers that you’re locked out. They should be able to help you.

If you don’t have a Google One account, follow these steps:

  1. Go to Google One.
  2. Choose a monthly storage option. You can get started with 100 gigs of storage for just $1.99 a month.
  3. After you set up your storage, click the Call button and tell them you’re locked out.

While I can’t absolutely guarantee this will get your Google account back in short order, I can tell you it worked flawlessly for my colleague: she got back into hers shortly after opening a Google One account. Some may object to paying, but if you can afford $23.88 a year for 100 gigs of storage and access to human tech support (for this or any other problem you have), it could be well worth it if it solves your issue quickly and gives you peace of mind.


Source: TechCrunch

Massive mortgage and loan data leak gets worse as original documents also exposed

Posted on Jan 24, 2019 | 1 comment

Remember that massive data leak of mortgage and loan data we reported on Wednesday?

In case you missed it, millions of documents were found leaking from an Elasticsearch server that had been left exposed without a password. The data contained highly sensitive financial information on tens of thousands of individuals who took out loans or mortgages over the past decade with U.S. financial institutions. The documents had been converted from their original paper form to a computer-readable format using optical character recognition (OCR) and stored in the database. They weren’t easy to read, but anyone who knew where to find the server could discern names, addresses, birth dates, Social Security numbers and other private financial data.
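For illustration only — the report doesn’t say which OCR stack was used — here is a minimal sketch of this kind of paper-to-text conversion, assuming the open-source Tesseract engine via pytesseract and a hypothetical scanned page:

    # Minimal OCR sketch (illustrative; the leak's actual pipeline is unknown).
    # Requires the Tesseract binary plus: pip install pytesseract pillow
    from PIL import Image
    import pytesseract

    # Hypothetical input: one scanned page of a loan or mortgage document.
    page = Image.open("scanned_mortgage_page.png")

    # Tesseract returns the recognized text as a plain string; a pipeline like
    # the one described would then store this computer-readable text in a
    # database such as Elasticsearch.
    text = pytesseract.image_to_string(page)
    print(text[:500])  # preview the first 500 recognized characters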

Independent security researcher Bob Diachenko and TechCrunch traced the source of the leaking database to a Texas-based data and analytics company, Ascension. When reached, the company said that one of its vendors, OpticsML, a New York-based document management startup, had mishandled the data and was to blame for the data leak.

It turns out that data was exposed again — but this time, it was the original documents.

Diachenko found the second trove of data in a separate exposed Amazon S3 storage server, which also was not protected with a password. Anyone who went to an easy-to-guess web address in their web browser could have accessed the storage server and seen — and downloaded — the files stored inside.

In a note to TechCrunch, Diachenko said he was “very surprised” to find the server in the first place, let alone open and accessible. Because Amazon storage servers are private by default and aren’t accessible to the web, someone would have had to make a conscious decision to set the server’s permissions to public.
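That default is worth underlining: a bucket owner has to actively switch it off. As a hedged sketch (the bucket name is hypothetical, and this illustrates S3’s general access controls, not how OpticsML’s server was configured), here is how an owner can test for anonymous access and enforce the private default with boto3:

    # Sketch: detect and shut off public access on an S3 bucket with boto3.
    # Requires: pip install boto3 (owner credentials for the enforcement step).
    import boto3
    from botocore import UNSIGNED
    from botocore.client import Config
    from botocore.exceptions import ClientError

    BUCKET = "example-document-bucket"  # hypothetical name

    # 1) Anonymous check: if this unsigned request succeeds, anyone with the
    #    bucket's easy-to-guess web address can list and download its files.
    anon = boto3.client("s3", config=Config(signature_version=UNSIGNED))
    try:
        anon.list_objects_v2(Bucket=BUCKET, MaxKeys=1)
        print("Bucket is world-readable")
    except ClientError:
        print("Anonymous access denied (good)")

    # 2) Enforcement: turn on all four public-access blocks, overriding any
    #    public ACLs or bucket policies an owner may have set by mistake.
    s3 = boto3.client("s3")
    s3.put_public_access_block(
        Bucket=BUCKET,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )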

The bucket contained 21 files comprising 23,000 pages of PDF documents stitched together — about 1.3 gigabytes in all. Diachenko said that portions of the data in the Elasticsearch database exposed on Wednesday matched data found in the Amazon S3 bucket, confirming that some or all of the data is the same as what was previously discovered. As in Wednesday’s report, the server contained documents from banks and financial institutions across the U.S., including loans and mortgage agreements. We also found documents from the U.S. Department of Housing and Urban Development, as well as W-2 tax forms, loan repayment schedules and other sensitive financial information.

Two of the files — redacted — found on the exposed storage server. (Image: TechCrunch)

Many of the files also contained names, addresses, phone numbers, Social Security numbers and more.

When we tried to reach OpticsML on Wednesday, its website had been pulled offline and the listed phone number was disconnected. After scouring old cached versions of the site, we found an email address.

TechCrunch emailed chief executive Sean Lanning, and the bucket was secured within the hour.

Lanning acknowledged our email but did not comment. Instead, OpticsML chief technology officer John Brozena confirmed the breach in a separate email, but declined to answer several questions about the exposed data — including how long the bucket was open and why it was set to public.

“We are working with the appropriate authorities and a forensic team to analyze the full extent of the situation regarding the exposed Elasticsearch server,” said Brozena. “As part of this investigation we learned that 21 documents used for testing were made identifiable by the previously discussed Elasticsearch leak. These documents were taken offline promptly.”

He added that OpticsML is “working to notify all affected parties” when asked about informing customers and state regulators, as per state data breach notification laws.

But Diachenko said there was no telling how many times the bucket might have been accessed before it was discovered.

“I would assume that after such publicity like these guys had, first thing you would do is to check if your cloud storage is down or, at least, password-protected,” he said.


Source: TechCrunch

Storage provider Cloudian raises $94M

Posted on Aug 29, 2018 | 0 comments

Cloudian, a company that specializes in helping businesses store petabytes of data, today announced that it has raised a $94 million Series E funding round. Investors in this round, which is one of the largest we have seen for a storage vendor, include Digital Alpha, Fidelity Eight Roads, Goldman Sachs, INCJ, JPIC (Japan Post Investment Corporation), NTT DOCOMO Ventures and WS Investments. This round includes a $25 million investment from Digital Alpha, which was first announced earlier this year.

With this, the seven-year-old company has now raised a total of $174 million.

As the company told me, it now has about 160 employees and 240 enterprise customers. Cloudian has found its sweet spot in managing the large video archives of entertainment companies, but its customers also include healthcare companies, automobile manufacturers and Formula One teams.

What’s important to stress here is that Cloudian’s focus is on on-premises storage, not cloud storage, though it does offer support for multi-cloud data management as well. “Data tends to be most effectively used close to where it is created and close to where it’s being used,” Cloudian VP of worldwide sales Jon Ash told me. “That’s because of latency, because of network traffic. You can almost always get better performance, better control over your data if it is being stored close to where it’s being used.” He also noted that it’s often costly and complex to move that data elsewhere, especially when you’re talking about the large amounts of information that Cloudian’s customers need to manage.

Unsurprisingly, companies that have this much data now want to use it for machine learning, too, so Cloudian is starting to get into this space, as well. As Cloudian CEO and co-founder Michael Tso also told me, companies are now aware that the data they pull in, no matter whether that’s from IoT sensors, cameras or medical imaging devices, will only become more valuable over time as they try to train their models. If they decide to throw the data away, they run the risk of having nothing with which to train their models.

Cloudian plans to use the new funding to expand its global sales and marketing efforts and increase its engineering team. “We have to invest in engineering and our core technology, as well,” Tso noted. “We have to innovate in new areas like AI.”

As Ash also stressed, Cloudian’s business is really data management — not just storage. “Data is coming from everywhere and it’s going everywhere,” he said. “The old-school storage platforms that were siloed just don’t work anywhere.”


Source: TechCrunch
