The blog of DataDiggers

Teams autonomously mapping the depths take home millions in Ocean Discovery Xprize

Posted on May 31, 2019 in Artificial Intelligence, conservation, Gadgets, Hardware, Robotics, Science, TC, XPRIZE

There’s a whole lot of ocean on this planet, and we don’t have much of an idea what’s at the bottom of most of it. That could change with the craft and techniques created during the Ocean Discovery Xprize, which had teams competing to map the sea floor quickly, precisely and autonomously. The winner just took home $4 million.

A map of the ocean would be valuable in and of itself, of course, but any technology used to do so could be applied in many other ways, and who knows what potential biological or medical discoveries hide in some nook or cranny a few thousand fathoms below the surface?

The prize, sponsored by Shell, started back in 2015. The goal was, ultimately, to create a system that could map hundreds of square kilometers of the sea floor at a five-meter resolution in less than a day — oh, and everything has to fit in a shipping container. For reference, existing methods do nothing like this, and are tremendously costly.

But as is usually the case with this type of competition, the difficulty did not discourage the competitors — it only spurred them on. Since 2015, then, the teams have been working on their systems and traveling all over the world to test them.

Originally the teams were to test in Puerto Rico, but after the devastating hurricane season of 2017, the whole operation was moved to the Greek coast. Ultimately after the finalists were selected, they deployed their craft in the waters off Kalamata and told them to get mapping.

Team GEBCO’s surface vehicle

“It was a very arduous and audacious challenge,” said Jyotika Virmani, who led the program. “The test itself was 24 hours, so they had to stay up, then immediately following that was 48 hours of data processing, after which they had to give us the data. It takes more traditional companies about two weeks or so to process data for a map once they have the raw data — we’re pushing for real time.”

This wasn’t a test in a lab bath or pool. This was the ocean, and the ocean is a dangerous place. But amazingly there were no disasters.

“Nothing was damaged, nothing imploded,” she said. “We ran into weather issues, of course. And we did lose one piece of technology that was subsequently found by a Greek fisherman a few days later… but that’s another story.”

At the start of the competition, Virmani said, there was feedback from the entrants that the autonomous piece of the task was simply not going to be possible. But the last few years have proven otherwise: the winning team not only met but exceeded the requirements of the task.

“The winning team mapped more than 250 square kilometers in 24 hours, at the minimum of five meters resolution, but around 140 was more than five meters,” Virmani told me. “It was all unmanned: An unmanned surface vehicle that took the submersible out, then recovered it at sea, unmanned again, and brought it back to port. They had such great control over it — they were able to change its path and its programming throughout that 24 hours as they needed to.” (It should be noted that unmanned does not necessarily mean totally hands-off — the teams were permitted a certain amount of agency in adjusting or fixing the craft’s software or route.)

A five-meter resolution, if you can’t quite picture it, would produce a map of a city that showed buildings and streets clearly, but is too coarse to catch, say, cars or street signs. When you’re trying to map two-thirds of the globe, though, this resolution is more than enough — and infinitely better than the nothing we currently have. (Unsurprisingly, it’s also certainly enough for an oil company like Shell to prospect new deep-sea resources.)
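For a sense of scale, here is a quick back-of-the-envelope sketch using only the 250 km² and five-meter figures above (the uniform square grid and steady survey rate are simplifications for illustration, not prize specifications):

```python
# Rough scale of the winning run: a 5 m grid over 250 square kilometers.
# Simplification: treats the map as a uniform square grid of depth estimates.
AREA_KM2 = 250            # area mapped in the 24-hour test
CELL_M = 5                # target horizontal resolution, in meters

area_m2 = AREA_KM2 * 1_000_000
cells = area_m2 // (CELL_M * CELL_M)
rate = cells / (24 * 3600)          # spread over the 24-hour survey window

print(f"{cells:,} depth cells")     # 10,000,000 depth cells
print(f"~{rate:.0f} cells/second")  # ~116 cells/second
```

Ten million depth estimates in a day, fully unmanned, is the kind of throughput that makes the two-week turnaround of traditional survey companies look quaint.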

The winning team was GEBCO, composed of veteran hydrographers — ocean mapping experts, you know. In addition to the highly successful unmanned craft (Sea-Kit, already cruising the English Channel for other purposes), the team did a lot of work on the data-processing side, creating a cloud-based solution that helped them turn the maps around quickly. (That may also prove to be a marketable service in the future.) They were awarded $4 million, in addition to their cash for being selected as a finalist.

The runner-up was Kuroshio, which had great resolution but was unable to map the full 250 km² due to weather problems. They snagged a million.

A bonus prize for having the submersible track a chemical signal to its source didn’t exactly have a winner, but the teams’ entries were so impressive that the judges decided to split the million between the Tampa Deep Sea Xplorers and Ocean Quest, which amazingly enough is made up mostly of middle-schoolers. The latter gets $800,000, which should help pay for a few new tools in the shop there.

Lastly, a $200,000 innovation prize was given to Team Tao out of the U.K., which had a very different style to its submersible that impressed the judges. While most of the competitors opted for a craft that went “lawnmower-style” above the sea floor at a given depth, Tao’s craft dropped down like a plumb bob, pinging the depths as it went down and back up before moving to a new spot. This provides a lot of other opportunities for important oceanographic testing, Virmani noted.

Having concluded the prize, the organization has just a couple more tricks up its sleeve. GEBCO, which stands for General Bathymetric Chart of the Oceans, is partnering with The Nippon Foundation on Seabed 2030, an effort to map the entire sea floor over the next decade and provide that data to the world for free.

And the program is also — why not? — releasing an anthology of short sci-fi stories inspired by the idea of mapping the ocean. “A lot of our current technology is from the science fiction of the past,” said Virmani. “So we told the authors, imagine we now have a high-resolution map of the sea floor, what are the next steps in ocean tech and where do we go?” The resulting 19 stories, written from all 7 continents (yes, one from Antarctica), will be available June 7.


Source: The Tech Crunch


Cat vs best and worst robot vacuum cleaners 

Posted on May 11, 2019 in cat, Gadgets, Home Appliances, Home Automation, laser, Roborock S6, robot, robotic vacuum cleaner, Robotics, Rowenta, TC, Vacuum

If you’ve flirted with the idea of buying a robot vacuum you may also have stepped back from the brink in unfolding horror at the alphabetic soup of branded discs popping into view. Consumer choice sounds like a great idea until you’ve tried to get a handle on the handle-less vacuum space.

Amazon offers an A to Z linklist of “top brands” that’s only a handful of letters short of a full alphabetic set. The horror.

What awaits the unseasoned robot vacuum buyer as they resign themselves to hours of online research to try to inform — or, well, form — a purchase decision is a seemingly endless permutation of robot vac reviews and round-ups.

Unfortunately there are just so many brands in play that all these reviews tend to act as fuel, feeding a growing black hole of indecision that sucks away at your precious spare time, demanding you spend more and more of it reading about robots that suck (when you could, let’s be frank, be getting on with the vacuuming task yourself) — only to come up for air each time even less convinced that buying a robot dirtbag is at all a good idea.

Reader, I know, because I fell into this hole. And it was hellish. So in the spirit of trying to prevent anyone else falling prey to convenience-based indecision I am — apologies in advance — adding to the pile of existing literature about robot vacuums with a short comparative account that (hopefully) helps cut through some of the chaff to the dirt-pulling chase.

Here’s the bottom line: Budget robot vacuums that lack navigational smarts are simply not worth your money, or indeed your time.

Yes, that’s despite the fact they are still actually expensive vacuum cleaners.

Basically these models entail overpaying for a vacuum cleaner that’s so poor you’ll still have to do most of the job yourself (i.e. with a non-robotic vacuum cleaner).

It’s the very worst kind of badly applied robotics.

Abandon hope of getting anything worth your money at the bottom end of the heap. I know this because, alas, I tried — opting, finally and foolishly (but, in my defence, at a point of near desperation after sifting so much virtual chaff the whole enterprise seemed to have gained lottery odds of success and I frankly just wanted my spare time back), for a model sold by a well-known local retailer.

It was a budget option but I assumed — or, well, hoped — the retailer had done its homework and picked a better-than-average choice. Or at least something that, y’know, could suck dust.

The brand in question (Rowenta) sat alongside the better known (and a bit more expensive) iRobot on the shop shelf. Surely that must count for something? I imagined wildly. Reader, that logic is a trap.

I can’t comment on the comparative performance of iRobot’s bots, which I have not personally tested, but I do not hesitate to compare a €180 (~$200) Rowenta-branded robot vacuum to a very expensive cat toy.

This robot vacuum was spectacularly successful at entertaining the cat — presumably on account of its dumb disposition, bouncing stupidly off of furniture owing to a total lack of navigational smarts. (Headbutting is a pretty big clue to how stupid a robot it is, as it’s never a stand-in for intelligence even when encountered in human form.)

Even more tantalizingly, from the cat’s point of view, the bot featured two white and whisker-like side brushes that protrude and spin at paw-tempting distance. In short: Pure robotic catnip.

The cat did not stop attacking the bot’s whiskers the whole time it was in operation. That certainly added to the obstacles getting in its way. But the more existential problem was that it wasn’t sucking very much at all.

At the end of its first concluded ‘clean’, after it somehow managed to lurch its way back to first bump into and then finally hump its charging hub, I extracted the bin and had to laugh at the modest-sized furball within. I’ve found larger clumps of dust gathering themselves in corners. So: Full marks for cat-based entertainment but as a vacuum cleaner it was horrible.

At this point I did what every sensible customer does when confronted with an abject lemon: Returned it for a full refund. And that, reader, might have been that for me and the cat and robot vacs. Who can be bothered to waste so much money and time for what appeared laughably incremental convenience? Even with a steady supply of cat fur to contend with.

But as luck would have it a Roborock representative emailed to ask if I would like to review their latest top-of-the-range model — which, at €549, does clock in at the opposite end of the price scale; ~3x the pitiful Rowenta. So of course I jumped at the chance to give the category a second spin — to see if a smarter device could impress me and not just tickle the cat’s fancy.

Clearly the price difference here, at the top vs the bottom of the range, is substantial. And yet, if you bought a car that was 3x cheaper than a Ferrari you’d still expect not just that the wheels stay on but that it can actually get you somewhere, in good time, and do so without making you horribly car sick.

Turns out buyers of robot vacuums need to tread far more carefully.

Here comes the bookending top-line conclusion: Robot vacuums are amazing. A modern convenience marvel. But — and it’s a big one — only if you’re willing to shell out serious cash to get a device that actually does the job intended.

Roborock S6: It’s a beast at gobbling your furry friend’s dander

Comparing the Roborock S6 and the Rowenta Smart Force Essential Aqua RR6971WH (to give it its full and equally terrible name) is like comparing a high-end electric car with a wind-up kid’s toy.

Where the latter product was so penny-pinching the company hadn’t even paid to include in the box a user manual that contained actual words — opting, we must assume, to save on translation costs by producing a comic packed with inscrutable graphics and bizarro ‘don’t do’ diagrams which only served to cement the fast-cooling buyer’s conviction they’d been sold a total lemon — the Roborock’s box contains a well-written paper manual with words and clearly labeled diagrams. What a luxury!

At the same time there’s not really that much you need to grok to get your head around operating the Roborock. After a first pass to familiarize yourself with its various functions it’s delightfully easy to use. It will even produce periodic vocal updates — such as telling you it’s done cleaning and is going back to base. (Presumably in case you start to worry it’s gone astray under the bed. Or that quiet industry is a front for brewing robotic rebellion against indentured human servitude.)

One button starts a full clean — and this does mean full thanks to on-board laser navigation that allows the bot to map the rooms in real-time. This means you get methodical passes, minimal headbutting and only occasional spots missed. (Another button will do a spot clean if the S6 does miss something or there’s a fresh spill that needs tidying — you just lift the bot to where you want it and hit the appropriate spot.)

There is an app too, if you want to access extra features like being able to tell it to go clean a specific room, schedule cleans or set no-go zones. But, equally delightfully, there’s no absolute need to hook the bot to your wi-fi just to get it to do its primary job. All core features work without the faff of having to connect it to the Internet — nor indeed the worry of who might get access to your room-mapping data. From a privacy point of view this wi-fi-less app-free operation is a major plus.

In a small apartment with hard flooring the only necessary prep is a quick check to clear stuff like charging cables and stray socks off the floor. You can of course park dining chairs on the table to offer the bot a cleaner sweep. Though I found the navigation pretty adept at circling chair legs. Sadly the unit is a little too tall to make it under the sofa.

The S6 includes an integrated mopping function, which works incredibly well on lino-style hard flooring (but won’t be any use if you only have carpets). To mop you fill the water tank attachment; velcro-fix a dampened mop cloth to the bottom; and slide-clip the whole unit under the bot’s rear. Then you hit the go button and it’ll vacuum and mop in the same pass.

In my small apartment the S6 had no trouble doing a full floor clean in under an hour, without needing to return to base to recharge in the middle. (Roborock says the S6 will drive for up to three hours on a single charge.)

It also did not seem to get confused by relatively dark flooring in my apartment — which some reviews had suggested can cause headaches for robot vacuums by confusing their cliff sensors.

After that first clean I popped the lid to check on the contents of the S6’s transparent lint bin — finding an impressive quantity of dusty fuzz neatly wadded therein. This was really just robot vacuum porn, though; the gleaming floors spoke for themselves on the quality of the clean.

The level of dust gobbled by the S6 vs the Rowenta underlines the quality difference between the bottom and top end of the robot vacuum category.

So where the latter’s plastic carapace immediately became a magnet for all the room dust it had kicked up but spectacularly failed to suck, the S6’s gleaming white shell has stayed remarkably lint-free, acquiring only a minimal smattering of cat hairs over several days of operation — while the floors it’s worked have been left visibly dust- and fur-free. (At least until the cat got to work dirtying them again.)

Higher suction power, better brushes and a higher quality integrated filter appear to make all the difference. The S6 also does a much better cleaning job a lot more quietly. Roborock claims it’s 50% quieter than the prior model (the S5) and touts it as its quietest robot vacuum yet.

It’s not super silent but is quiet enough when cleaning hard floors not to cause a major disturbance if you’re working or watching something in the same room. Though the novelty can certainly be distracting.

Even the look of the S6 exudes robotic smarts — with its raised laser-housing bump resembling a glowing orange cylonic eye-slot.

I was surprised, at first glance, by the single, rather feeble-looking side brush vs the firm pair the Rowenta had fixed to its undercarriage. But again the S6’s tool is smartly applied — stepping its speed up and down depending on what the bot’s tackling. I found it could miss the odd bit of lint or debris such as cat litter, but when it did, these specks stood out as the exception on an otherwise clean floor.

It’s also true that the cat did stick its paw in again to try attacking the S6’s single spinning brush. But these attacks were fewer and a lot less fervent than vs the Rowenta, as if the bot’s more deliberate navigation commanded greater respect and/or a more considered ambush. So it appears that even to a feline eye the premium S6 looks a lot less like a dumb toy.

Cat plots another ambush while the S6 works the floor

On a practical front, the S6’s lint bin has a capacity of 480ml. Roborock suggests cleaning it out weekly (assuming you’re using the bot every week), as well as washing the integrated dust filter (it supplies a spare in the box so you can switch one out to clean it and have enough time for it to fully dry before rotating it back into use).

If you use the mopping function the supplied reusable mop cloths do need washing afterwards too (Roborock also includes a few disposable alternatives in the box but that seems a pretty wasteful option when it’s easy enough to stick a reusable cloth in with a load of laundry or give it a quick wash yourself). So if you’re chasing a fully automated, robot-powered, end-to-cleaning-chores dream be warned there’s still a little human elbow grease required to keep everything running smoothly.

Still, there’s no doubt a top-of-the-range robot vacuum like the S6 will save you time cleaning.

If you can justify the not inconsiderable cost involved in buying this extra time by shelling out for a premium robot vacuum that’s smart enough to clean effectively, all that’s left to figure out is how to spend your time windfall wisely — resisting the temptation to just put your feet up and watch the clever little robot at work.


Source: The Tech Crunch


Boston Dynamics showcases new uses for SpotMini ahead of commercial production

Posted on Apr 20, 2019 in boston dynamics, Robotics, TC, TC Sessions: Robotics + AI

Last year at our TC Sessions: Robotics event, Boston Dynamics announced its intention to commercialize SpotMini. It was a big step for the secretive company. After a quarter of a century building some of the world’s most sophisticated robots, it was finally taking a step into the commercial realm, making the quadrupedal robot available to anyone with the need and financial resources for the device.

CEO Marc Raibert made a return appearance at our event this week to discuss the progress Boston Dynamics has made in the intervening 12 months, both with regard to SpotMini and the company’s broader intentions to take a more market-based approach to a number of its creations.

The appearance came hot on the heels of a key acquisition for the company. In fact, Kinema was the first major acquisition in the company’s history — no doubt helped along by the very deep coffers of its parent company, SoftBank. The Bay Area-based startup’s imaging technology forms a key component of Boston Dynamics’ revamped version of its wheeled robot, Handle, whose dual arms have been replaced with a multi-suction-cup gripper guided by a new vision system.

A recent video from the company demonstrated the efficiency and speed with which the system can be deployed to move boxes from shelf to conveyor belt. As Raibert noted onstage, Handle is the closest Boston Dynamics has come to a “purpose-built robot” — i.e. a robot designed from the ground up to perform a specific task. It marks a new focus for a company that, after its earliest days of DARPA-funded projects, appears to primarily be driven by the desire to create the world’s most sophisticated robots.

“We estimate that there’s about a trillion cubic foot boxes moved around the world every year,” says Raibert. “And most of it’s not automated. There’s really a huge opportunity there. And of course this robot is great for us, because it includes the DNA of a balancing robot and moving dynamically and having counterweights that let it reach a long way. So it’s not different, in some respects, from the robots we’ve been building for years. On the other hand, some of it is very focused on grasping, being able to see boxes and do tasks like stack them neatly together.”

The company will maintain a foot on that side of things, as well. Robots like the humanoid Atlas will still form an important piece of its work, even when no commercial applications are immediately apparent.

But once again, it was SpotMini who was the real star of the show. This time, however, the company debuted the version of the robot that will go into production. At first glance, the robot looked remarkably similar to the version we had onstage last year.

“We’ve redesigned many of the components to make it more reliable, to make the skins work better and to protect it if it does fall,” says Raibert. “It has two sets [of cameras] on the front, and one on each side and one on the back. So we can see in all directions.”

I had the opportunity to pilot the robot — making me one of a relatively small group of people outside of the Boston Dynamics offices who’ve had the chance to do so. While SpotMini has all of the necessary technology for autonomous movement, user control is possible and preferred in certain situations (some of which we’ll get to shortly).

[Gifs featured are sped up a bit from original video above]

The controller is an OEMed design that looks something like an Xbox controller with an elongated touchscreen in the middle. The robot can be controlled directly with the touchscreen, but I opted for a pair of joysticks. Moving Spot around is a lot like piloting a drone. One joystick moves the robot forward and back, the other turns it left and right.

Like a drone, it takes some getting used to, particularly with regard to the orientation of the robot. One direction is always forward for the robot, but not necessarily for the pilot. Tapping a button on the screen switches the joystick functionality to the arm (or “neck,” depending on your perspective). This can be moved around like a standard robotic arm/grasper. The grasper can also be held stationary, while the rest of the robot moves around it in a kind of shimmying fashion.

Once you get the hang of it, it’s actually pretty simple. In fact, my mother, whose video game experience peaked out at Tetris, was backstage at the event and happily took the controller from Boston Dynamics, controlling the robot with little issue.

Boston Dynamics is peeling back the curtain more than ever. During our conversation, Raibert debuted behind-the-scenes footage of component testing. It’s a sight to behold, with various pieces of the robot splayed out on a lab bench. It’s a side of Boston Dynamics we’ve not really seen before. Ditto for the images of large SpotMini testing corrals, where several are patrolling around autonomously.

Boston Dynamics also has a few more ideas of what the future could look like for the robot. Raibert shared footage of Massachusetts State Police utilizing Spot in different testing scenarios, where the robot’s ability to open doors could potentially get human officers out of harm’s way during a hostage or terrorist situation.

Another unit was programmed to autonomously patrol a construction site in Tokyo, outfitted with a Street View-style 360 camera, so it can monitor building progress. “This lets the construction company get an assessment of progress at their site,” he explains. “You might think that that’s a low end task. But these companies have thousands of sites. And they have to patrol them at least a couple of times a week to know where they are in progress. And they’re anticipating using Spot for that. So we have over a dozen construction companies lined up to do tests at various stages of testing and proof of concept in their scenarios.”

Raibert says SpotMini is still on track for a July release. The company plans to manufacture around 100 in its initial run, though it’s still not ready to talk about pricing.


Source: The Tech Crunch


Breeze Automation is building soft robots for the Navy and NASA

Posted on Apr 18, 2019 in breeze automation, Gui Cavalcanti, MegaBots, Robotics, TC Sessions: Robotics + AI 2019

San Francisco soft robotics startup Breeze Automation made its debut today onstage at TechCrunch’s TC Sessions: Robotics + AI event at UC Berkeley. Co-founder and CEO Gui Cavalcanti joined us onstage at the event to showcase the contract work the company has been doing for organizations like NASA and the U.S. Navy.

Cavalcanti last joined TechCrunch onstage in September 2016, decked out in aviator sunglasses and full American flag regalia as a co-founder of fighting robot league MegaBots. These days, however, the Boston Dynamics alum’s work is a lot more serious and subdued, solving problems in dangerous settings like underwater and outer space.

Developed as part of San Francisco R&D facility Otherlab, Breeze leverages the concept of highly adaptable soft robotics. The company’s robotic arms are air-filled fabric structures.

“The concept Otherlab has been developing for around seven years has been this idea of fluidic robots — hydraulic and pneumatic robots that are very cheap,” Cavalcanti told TechCrunch in a conversation ahead of today’s event. “Very robust to the environment and made with very lightweight materials. The original concept was, what is the simplest possible robot you can make, and what is the lightest robot you can make? What that idea turned into was these robots made of fabric and air.”

Breeze separates itself from much of the competition in the soft robotics space by applying these principles to the entire structure, instead of just, say, a gripper on the end of a more traditional robotic arm.

“All of that breaks down the second you get out of those large factories, and the question of how robots interact with the real world becomes a lot more pressing,” Cavalcanti says. “What we’re trying to do is take a lot more of the research around soft robotics and the advantages of being fully sealed systems that are moved with really compliant sources of actuation like air. It turns out that when you’re trying to interact with an environment that’s unpredictable or unstructured, where you’re going to bump into things and you’re not going to get it right because you don’t have full sensing of the state of the world, there are a lot of advantages to having entire manipulators and arms be soft instead of just the end effector.”

Breeze showcased several works in progress, including a system developed for the Navy that uses an HTC Vive headset for remote operation. The company’s work with NASA, meanwhile, involves the creation of a robotic system that doesn’t require a central drive shaft, marking a departure from more traditional robotic systems.

“You’re now looking at robot joints that can handle significant loads, that could be entirely injection molded,” explains Cavalcanti. “You don’t need a metal shaft, you don’t need a set of bearings or whatever. You can just have a bunch of injection-molded plastic pieces that are put together, and there’s your robot.”

Most of the company’s funding is currently coming from federal contracts from places like the Navy and NASA, but going forward, Breeze is shifting more toward commercial contracts. “Our mission right now is to harden our technology and prepare for real-world application, and that is pretty much 100 percent our focus,” he says. “Once we do harden it, there are a variety of options for going commercial that we’d like to explore.”


Source: The Tech Crunch


Industrial robotics giant Fanuc is using AI to make automation even more automated

Posted on Apr 18, 2019 in Artificial Intelligence, Asia, bin-picking, fanuc, industrial automation, Industrial Robotics, manufacturing, Robotics, TC Sessions: Robotics + AI

Industrial automation is already streamlining the manufacturing process, but first those machines must be painstakingly trained by skilled engineers. Industrial robotics giant Fanuc wants to make robots easier to train, therefore making automation more accessible to a wider range of industries, including pharmaceuticals. The company announced a new artificial intelligence-based tool at TechCrunch’s Robotics + AI Sessions event today that teaches robots how to pick the right objects out of a bin with simple annotations and sensor technology, reducing the training process by hours.

Bin-picking is exactly what it sounds like: a robot arm is trained to pick items out of bins, and is used for tedious, time-consuming tasks like sorting bulk orders of parts. Images of example parts are taken with a camera for the robot to match using its vision sensors. The conventional process of training bin-picking robots then means teaching them many rules so they know which parts to pick up.

“Making these rules in the past meant having to go through a lot of iterations and trial and error. It took time and was very cumbersome,” said Dr. Kiyonori Inaba, the head of Fanuc Corporation’s Robot Business Division, during a conversation ahead of the event.

These rules include details like how to locate the parts on the top of the pile or which ones are the most visible. Then after that, human operators need to tell it when it makes an error in order to refine its training. In industries that are relatively new to automation, finding enough engineers and skilled human operators to train robots can be challenging.

This is where Fanuc’s new AI-based tool comes in. It simplifies the training process so the human operator just needs to look at a photo of parts jumbled in a bin on a screen and tap a few examples of what needs to be picked up, like showing a small child how to sort toys. This is significantly less training than what typical AI-based vision sensors need and can also be used to train several robots at once.
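Fanuc hasn’t published how the tool works internally, but the tap-a-few-examples workflow described above can be sketched as a simple similarity filter over detected parts. Everything below is hypothetical: the two-number features, the distance threshold and the function names are illustrative stand-ins, not Fanuc’s API.

```python
import math

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def pickable(candidates, tapped_examples, threshold):
    """Return IDs of candidates whose features lie close to any tapped example.

    candidates      -- list of (id, feature_vector) for parts detected in the bin
    tapped_examples -- feature vectors of the few parts the operator tapped
    threshold       -- max feature distance to count as 'same kind of part'
    """
    picks = []
    for part_id, feats in candidates:
        if min(euclidean(feats, ex) for ex in tapped_examples) <= threshold:
            picks.append(part_id)
    return picks

# Toy features: (visible_area, height_in_pile) -- purely illustrative.
detected = [("a", (0.9, 0.8)), ("b", (0.2, 0.1)), ("c", (0.85, 0.75))]
taps = [(0.9, 0.8)]   # operator taps one clearly visible part on screen
print(pickable(detected, taps, threshold=0.2))  # ['a', 'c']
```

A real system would use learned image features rather than hand-picked numbers, but the shape of the interaction is the same: a handful of operator taps stand in for the many hand-written rules described above.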

“It is really difficult for the human operator to show the robot how to move in the same way the operator moves things,” said Inaba. “But by utilizing AI technology, the operator can teach the robot more intuitively than conventional methods.” He adds that the technology is still in its early stages, and it remains to be seen whether it can be used in assembly as well.


Source: The Tech Crunch


Aptiv takes its self-driving car ambitions (and tech) to China

Posted on Apr 17, 2019 in Aptiv, Automation, Automotive, automotive industry, boston, China, Co-founder, Delphi, Emerging-Technologies, Karl Iagnemma, Las Vegas, Lyft, manufacturing, NuTonomy, pittsburgh, president, Robotics, self driving cars, shanghai, Singapore, transport, Transportation, United States

Aptiv, the U.S. auto supplier and self-driving software company, is opening an autonomous mobility center in Shanghai to focus on the development and eventual deployment of its technology on public roads.

The expansion marks the fifth market where Aptiv has set up R&D, testing or operational facilities. Aptiv has autonomous driving operations in Boston, Las Vegas, Pittsburgh and Singapore. But China is perhaps its most ambitious endeavor yet.

Aptiv has never had any AV operations in China, but it does have a long history in the country, including manufacturing and engineering facilities. The company, in its earlier forms as Delphi and Delco, has been in China since 1993. That experience will be invaluable as it tries to bring its autonomous vehicle efforts into a new market, Aptiv Autonomous Mobility President Karl Iagnemma told TechCrunch in a recent interview.

“The long-term opportunity in China is off the charts,” Iagnemma said, citing a recent McKinsey study that projects the country will host two-thirds of the world’s autonomously driven miles by 2040 and represent a trillion-dollar mobility service opportunity.

“For Aptiv, it’s always been a question of not ‘if’, but when we’re going to enter the Chinese market,” he added.

Aptiv will have self-driving cars testing on public roads by the second half of 2019.

“Our experience in other markets has shown that in this industry, you learn by doing,” Iagnemma explained.

And it’s a remark Iagnemma can stand by. He is the co-founder of self-driving car startup nuTonomy, which in 2016 launched in Singapore one of the first robotaxi services the public could use, with human safety drivers on board.

NuTonomy was acquired by Delphi in 2017 for $450 million. NuTonomy became part of Aptiv after its spinoff from Delphi was complete.

Aptiv is also in discussions with potential partners for mapping and commercial deployment of Aptiv’s vehicles in China.

Some of those partnerships will likely mimic the kinds of relationships Aptiv has created in the U.S., notably with Lyft. Aptiv’s self-driving vehicles operate on Lyft’s ride-hailing platform in Las Vegas, where they have provided more than 40,000 paid autonomous rides via the Lyft app.

Aptiv will also have to create new kinds of partnerships unlike those it has in the U.S. due to restrictions and rules in China around data collection, intellectual property and creating high resolution map data.


Source: The Tech Crunch


Disney/Lucasfilm donates $1.5 million to FIRST

Posted by on Apr 13, 2019 in Disney, Education, first, lucasfilm, Robotics | 0 comments

A day after the big Episode IX reveal, Disney and its subsidiary Lucasfilm announced that they will donate $1.5 million to FIRST. The non-profit was founded by Dean Kamen in 1989 to help teach STEM through initiatives like robotics competitions.

Disney’s money will go to provide education and outreach to the underserved communities on which FIRST focuses. Details are pretty thin on precisely what the partnership will entail, but Disney’s certainly got a lot to gain from this sort of outreach — and Lucasfilm knows a thing or two about robots.

The Star Wars: Force for Change announcement was made in conjunction with Lucasfilm’s annual Star Wars Celebration in Chicago. Yesterday the event hosted a panel with the cast of the upcoming film that included a teaser trailer and title reveal.

“Star Wars has always inspired young people to look past what is and imagine a world beyond,” Lucasfilm president Kathleen Kennedy said in a release tied to the news. “It is crucial that we pass on the importance of science and technology to young people—they will be the ones who will have to confront the global challenges that lie ahead. To support this effort, Lucasfilm and Disney are teaming up with FIRST to bring learning opportunities and mentorship to the next generation of innovators.”

It’s been a good week for FIRST investments. Just yesterday Amazon announced its own commitment to the group’s robotics offerings.


Source: The Tech Crunch


Mars helicopter bound for the Red Planet takes to the air for the first time

Posted by on Mar 28, 2019 in drones, Gadgets, Government, Hardware, jpl, mars 2020, mars helicopter, NASA, Robotics, Science, Space, TC, UAVs | 0 comments

The Mars 2020 mission is on track for launch next year, and nestled inside the high-tech new rover is a helicopter designed to fly in the planet’s nearly non-existent atmosphere. The actual aircraft that will fly on the Martian surface just took its first flight, and its engineers are over the moon.

“The next time we fly, we fly on Mars,” said MiMi Aung, who manages the project at JPL, in a news release. An engineering model that was very close to final has over an hour of time in the air, but these two brief test flights were the first and last time the tiny craft will take flight until it does so on the distant planet (not counting its “flight” during launch).

“Watching our helicopter go through its paces in the chamber, I couldn’t help but think about the historic vehicles that have been in there in the past,” she continued. “The chamber hosted missions from the Ranger Moon probes to the Voyagers to Cassini, and every Mars rover ever flown. To see our helicopter in there reminded me we are on our way to making a little chunk of space history as well.”

Artist’s impression of how the helicopter will look when it’s flying on Mars.

A helicopter flying on Mars is much like a helicopter flying on Earth, except of course for the slight differences that the other planet has about a third of Earth’s gravity and 99 percent less air. It’s more like flying at 100,000 feet, Aung suggested.

It has its own solar panel so it can explore more or less on its own.

The test rig they set up not only produces a near-vacuum, replacing the air with a thin, Mars-like CO2 mix; a “gravity offload” system also simulates the lower gravity by giving the helicopter a slight lift via a cable.

It flew at a whopping 2 inches of altitude for a total of a minute in two tests, which was enough to show the team that the craft (with all its 1,500 parts and four pounds) was ready to package up and send to the Red Planet.
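As a back-of-the-envelope check on the gravity offload described above: the cable only needs to carry the difference between the craft's Earth weight and its Mars weight, so the rotors lift exactly what they would lift on Mars. The sketch below uses the article's four-pound figure and standard gravity values; the exact rig parameters are not public, so treat the numbers as illustrative:

```python
EARTH_G = 9.81     # m/s^2
MARS_G = 3.71      # m/s^2, about 38% of Earth's
LB_TO_KG = 0.4536

mass_kg = 4 * LB_TO_KG  # the article's "four pounds"

# Cable tension = Earth weight minus Mars weight; the rotors supply the rest.
offload_force_n = mass_kg * (EARTH_G - MARS_G)
mars_weight_n = mass_kg * MARS_G

print(f"cable offload: {offload_force_n:.1f} N")
print(f"effective Mars weight the rotors must lift: {mars_weight_n:.1f} N")
```

The cable carries most of the load (roughly 11 N versus about 7 N for the rotors), which is why even a "slight lift via a cable" is enough to make the thin-atmosphere hover representative of Mars.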

“It was a heck of a first flight,” said tester Teddy Tzanetos. “The gravity offload system performed perfectly, just like our helicopter. We only required a 2-inch hover to obtain all the data sets needed to confirm that our Mars helicopter flies autonomously as designed in a thin Mars-like atmosphere; there was no need to go higher.”

A few months after the Mars 2020 rover has landed, the helicopter will detach and make a few test flights of up to 90 seconds. Those will be the first heavier-than-air flights on another planet: powered flight, in other words, rather than, say, a balloon filled with gaseous hydrogen.

The craft will operate mostly autonomously, since the half-hour round trip for commands would be far too long for an Earth-based pilot to operate it. It has its own solar cells and batteries, plus little landing feet, and will attempt flights of increasing distance from the rover over a 30-day period. It should go about three meters in the air and may eventually get hundreds of meters away from its partner.
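The "half-hour round trip" for commands is just light-speed delay, and it can be sanity-checked with rough Earth-Mars distances (the true figure swings between about 6 and 44 minutes as the planets' separation changes; the distances below are approximate, not mission-specific):

```python
C_KM_S = 299_792.458  # speed of light, km/s

def round_trip_minutes(distance_km):
    # Signal out plus acknowledgment back: twice the one-way light time.
    return 2 * distance_km / C_KM_S / 60

# Earth-Mars distance varies from roughly 55 million to 400 million km.
for label, d_km in [("closest", 55e6), ("typical", 225e6), ("farthest", 401e6)]:
    print(f"{label}: {round_trip_minutes(d_km):.1f} min round trip")
```

At a typical separation the round trip is about 25 minutes, which matches the article's half-hour figure and explains why joystick-style piloting from Earth is impossible.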

Mars 2020 is expected to launch next summer, arriving at its destination early in 2021. In the meantime we’ve still got Curiosity and InSight up there, so if you want the latest from Mars, you’ve got plenty of options to choose from.


Source: The Tech Crunch


UC Berkeley’s Ken Goldberg and Michael I. Jordan will discuss AI at TC Sessions: Robotics + AI April 18

Posted by on Mar 14, 2019 in Anthony Levandowski, articles, Artificial Intelligence, cofounder, colin angle, editor-in-chief, Events, marc raibert, Robotics, TC, TC Sessions: Robotics + AI, TC Sessions: Robotics+AI 2019, uc-berkeley | 0 comments

We’re just over a month out from our TC Sessions: Robotics + AI event at UC Berkeley on April 18. We’ve already announced a number of marquee guests for the event, including Marc Raibert, Colin Angle, Melonee Wise and Anthony Levandowski. Today we’ve got another exciting panel to unveil and, as an FYI, our early-bird sale ends Friday!

This is our third robotics event, but it’s the first time artificial intelligence has shared the spotlight. Today we’re revealing that two of UC Berkeley’s top names in the space will be sharing the stage to discuss the role of AI in society for a panel titled “Artificial Intelligence: Minds, Economies and Systems that Learn.”

The pair of professors will be discussing how AI grew to become one of modern society’s most ubiquitous and wide-ranging technologies. The panel will also explore where the tech will go from here.

Ken Goldberg is a professor of Industrial Engineering and Operations Research at UC Berkeley. He has co-authored more than 200 peer-reviewed papers on automation, robotics and social information. He is the editor-in-chief of IEEE Transactions on Automation Science and Engineering and co-founder of the Berkeley Center for New Media.

Michael I. Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Science and the Department of Statistics at UC Berkeley. His work touches on a wide range of topics, including computer science, AI and computational biology. He is a member of the National Academy of Engineering, the American Academy of Arts and Sciences and a Fellow of the American Association for the Advancement of Science.

Early-bird ticket sales end tomorrow, Friday. Book your tickets today and save $100 before prices increase.

Students, grab your discounted $45 tickets here.

Startups, make sure to check out our demo table packages, which include three tickets, for just $1,500.


Source: The Tech Crunch


MIT’s deflated balloon robot hand can pick up objects 100x its own weight

Posted by on Mar 14, 2019 in CSAIL, harvard, MIT, Robotics, soft robot | 0 comments

Soft, biologically inspired robots have become one of the field’s most exciting offshoots, with machines capable of squeezing between obstacles and conforming to the world around them. A joint project between MIT CSAIL and Harvard’s Wyss Institute converts those learnings into a simple, soft robotic gripper capable of handling delicate objects and picking up items as heavy as 100 times its own weight.

The gripper itself is made of an origami-inspired skeletal structure, covered in either fabric or a deflated balloon. It’s a principle the team recently employed on another project designed to create low-cost artificial muscles. A connector attaches the gripper to the arm and also sports a vacuum tube that sucks air out from the gripper, collapsing it around an object.

Like Soft Robotics’ commercial gripper, the malleable nature of the device means it can grab hold of a wide range of objects with less need for a complex vision system. It also means it can grab delicate items without damaging them in the process.
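A rough intuition for why a vacuum-collapsed gripper can hold far more than its own weight: the holding force scales with the pressure differential times the contact area. The numbers below are purely illustrative, not the Magic Ball gripper's published specs:

```python
import math

# Illustrative assumptions: a 50 kPa pressure differential pulled by the
# vacuum, acting over a circular contact patch of 4 cm radius.
delta_p_pa = 50_000        # pressure differential, Pa
contact_radius_m = 0.04    # contact patch radius, m

area_m2 = math.pi * contact_radius_m ** 2
holding_force_n = delta_p_pa * area_m2  # F = delta-P x A

print(f"holding force: {holding_force_n:.0f} N")  # roughly 25 kg-force
```

Even a modest partial vacuum over a few square centimeters yields hundreds of newtons, which is how a lightweight origami skeleton wrapped in a balloon can lift objects 100 times its own weight.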

“Previous approaches to the packing problem could only handle very limited classes of objects — objects that are very light or objects that conform to shapes such as boxes and cylinders, but with the Magic Ball gripper system we’ve shown that we can do pick-and-place tasks for a large variety of items ranging from wine bottles to broccoli, grapes and eggs,” MIT professor Daniela Rus says in a release tied to the news. “In other words, objects that are heavy and objects that are light. Objects that are delicate, or sturdy, or that have regular or free form shapes.”


Source: The Tech Crunch
