Search the Community
Showing results for tags 'self-driving vehicle'.
-
Technology is moving at a rapid rate, and it seems every aspect of our lives is affected by it. Take cars, for example: many believe that in the future cars will be fully self-driving and powered by some sort of alternative powertrain. One person believes that kids born now will not even need a driver's license because of this technology. Henrik Christensen, head of UC San Diego’s Contextual Robotics Institute, said in an interview with The San Diego Union-Tribune that in about 10 to 15 years, autonomous vehicles will be a regular part of our lives. “My own prediction is that kids born today will never get to drive a car. Autonomous, driverless cars are 10, 15 years out. All the automotive companies — Daimler, GM, Ford — are saying that within five years they will have autonomous, driverless cars on the road,” said Christensen.

The paper asked Christensen how he feels about future generations not driving. He said, “I love to drive my car, but it’s a question of how much time people waste sitting in traffic and not doing something else. The average person in San Diego probably spends an hour commuting every day. If they could become more productive, that would be good. With autonomous, driverless cars, we can put twice as many vehicles on the road as we have today, and do it without improving the infrastructure.”

Christensen also believes car ownership will become a thing of the past. “There would be no need to have parking garages in downtown San Diego. In theory, you’d get out of the car and say, ‘Pick me up at 4 PM.’ Long-term — we’re talking 20 years into the future — you’re not even going to own a car. A car becomes a service.”

Do you think Christensen is on the right track, or is his head in the clouds?

Source: The San Diego Union-Tribune
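Christensen’s claim that automation could put “twice as many vehicles on the road” without new infrastructure is essentially a headway argument: if cars can safely follow one another more closely, each lane carries more traffic per hour. The short Python sketch below is a back-of-envelope illustration only, not from the article; the cruise speed, vehicle length, and headway times are assumed values.

# Rough lane-capacity check for the "twice as many vehicles" claim.
# All numbers are assumptions for illustration: 30 m/s (~65 mph) cruise speed,
# 5 m vehicle length, ~1.5 s following headway for human drivers vs. ~0.6 s
# for closely coordinated automated cars.

def lane_capacity(speed_mps: float, headway_s: float, car_length_m: float = 5.0) -> float:
    """Vehicles per hour through one lane: speed divided by the road space each car occupies."""
    spacing_m = speed_mps * headway_s + car_length_m
    return 3600.0 * speed_mps / spacing_m

human = lane_capacity(30.0, 1.5)      # roughly 2,160 vehicles/hour
automated = lane_capacity(30.0, 0.6)  # roughly 4,700 vehicles/hour
print(f"human: {human:.0f}/h, automated: {automated:.0f}/h, ratio: {automated / human:.1f}x")

Under those assumed numbers the ratio comes out a little above 2x, which is the ballpark Christensen describes; real-world capacity would also depend on merging, incidents, and mixed human/automated traffic.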
- 3 replies · Tagged with: autonomous, future (and 1 more)
-
Ford is making a big push in autonomous vehicles by announcing plans to put a fully autonomous car into production by 2021. But you will not be able to buy one. Instead, Ford will be building these vehicles for ride-sharing services.

“The next decade will be defined by automation of the automobile, and we see autonomous vehicles as having as significant an impact on society as Ford’s moving assembly line did 100 years ago. We’re dedicated to putting on the road an autonomous vehicle that can improve safety and solve social and environmental challenges for millions of people – not just those who can afford luxury vehicles," said Mark Fields, Ford president and CEO, in a statement yesterday.

Ford says the vehicles will feature autonomous technology classified by the SAE as 'level 4': the vehicle performs all critical driving functions and monitors the road during a trip. The company expects its first autonomous vehicles to have no steering wheel or pedals. To meet this goal, Ford will increase its autonomous vehicle test fleet to 30 vehicles by the end of the year, then triple that amount to 90 vehicles next year.

Ford also announced a number of collaborations with and investments in various companies, including:
- Velodyne: Light detection and ranging (LiDAR) sensors
- SAIPS: Computer vision and machine learning company to help with artificial intelligence and enhancing computer vision
- Nirenberg Neuroscience LLC: Machine vision company specializing in object recognition
- Civil Maps: Develops high-resolution 3D maps

Source: Ford

Press release:

FORD TARGETS FULLY AUTONOMOUS VEHICLE FOR RIDE SHARING IN 2021; INVESTS IN NEW TECH COMPANIES, DOUBLES SILICON VALLEY TEAM

- Ford announces intention to deliver high-volume, fully autonomous vehicle for ride sharing in 2021
- Ford investing in or collaborating with four startups on autonomous vehicle development
- Company also doubling Silicon Valley team and more than doubling Palo Alto campus

PALO ALTO, Calif., Aug. 16, 2016 – Ford today announces its intent to have a high-volume, fully autonomous SAE level 4-capable vehicle in commercial operation in 2021 in a ride-hailing or ride-sharing service. To get there, the company is investing in or collaborating with four startups to enhance its autonomous vehicle development, doubling its Silicon Valley team and more than doubling its Palo Alto campus.

“The next decade will be defined by automation of the automobile, and we see autonomous vehicles as having as significant an impact on society as Ford’s moving assembly line did 100 years ago,” said Mark Fields, Ford president and CEO. “We’re dedicated to putting on the road an autonomous vehicle that can improve safety and solve social and environmental challenges for millions of people – not just those who can afford luxury vehicles.”

Autonomous vehicles in 2021 are part of Ford Smart Mobility, the company’s plan to be a leader in autonomous vehicles, as well as in connectivity, mobility, the customer experience, and data and analytics.

Driving autonomous vehicle leadership

Building on more than a decade of autonomous vehicle research and development, Ford’s first fully autonomous vehicle will be a Society of Automotive Engineers-rated level 4-capable vehicle without a steering wheel or gas and brake pedals.
It is being specifically designed for commercial mobility services, such as ride sharing and ride hailing, and will be available in high volumes.

“Ford has been developing and testing autonomous vehicles for more than 10 years,” said Raj Nair, Ford executive vice president, Global Product Development, and chief technical officer. “We have a strategic advantage because of our ability to combine the software and sensing technology with the sophisticated engineering necessary to manufacture high-quality vehicles. That is what it takes to make autonomous vehicles a reality for millions of people around the world.”

This year, Ford will triple its autonomous vehicle test fleet to be the largest test fleet of any automaker – bringing the number to about 30 self-driving Fusion Hybrid sedans on the roads in California, Arizona and Michigan, with plans to triple it again next year. Ford was the first automaker to begin testing its vehicles at Mcity, University of Michigan’s simulated urban environment, the first automaker to publicly demonstrate autonomous vehicle operation in the snow and the first automaker to test its autonomous research vehicles at night, in complete darkness, as part of LiDAR sensor development.

To deliver an autonomous vehicle in 2021, Ford is announcing four key investments and collaborations that are expanding its strong research in advanced algorithms, 3D mapping, LiDAR, and radar and camera sensors:

- Velodyne: Ford has invested in Velodyne, the Silicon Valley-based leader in light detection and ranging (LiDAR) sensors. The aim is to quickly mass-produce a more affordable automotive LiDAR sensor. Ford has a longstanding relationship with Velodyne, and was among the first to use LiDAR for both high-resolution mapping and autonomous driving beginning more than 10 years ago.
- SAIPS: Ford has acquired the Israel-based computer vision and machine learning company to further strengthen its expertise in artificial intelligence and enhance computer vision. SAIPS has developed algorithmic solutions in image and video processing, deep learning, signal processing and classification. This expertise will help Ford autonomous vehicles learn and adapt to their surroundings.
- Nirenberg Neuroscience LLC: Ford has an exclusive licensing agreement with Nirenberg Neuroscience, a machine vision company founded by neuroscientist Dr. Sheila Nirenberg, who cracked the neural code the eye uses to transmit visual information to the brain. This has led to a powerful machine vision platform for performing navigation, object recognition, facial recognition and other functions, with many potential applications. For example, it is already being applied by Dr. Nirenberg to develop a device for restoring sight to patients with degenerative diseases of the retina. Ford’s partnership with Nirenberg Neuroscience will help bring humanlike intelligence to the machine learning modules of its autonomous vehicle virtual driver system.
- Civil Maps: Ford has invested in Berkeley, California-based Civil Maps to further develop high-resolution 3D mapping capabilities. Civil Maps has pioneered an innovative 3D mapping technique that is scalable and more efficient than existing processes. This provides Ford another way to develop high-resolution 3D maps of autonomous vehicle environments.

Silicon Valley expansion

Ford also is expanding its Silicon Valley operations, creating a dedicated campus in Palo Alto.
Adding two new buildings and 150,000 square feet of work and lab space adjacent to the current Research and Innovation Center, the expanded campus grows the company’s local footprint and supports plans to double the size of the Palo Alto team by the end of 2017. “Our presence in Silicon Valley has been integral to accelerating our learning and deliverables driving Ford Smart Mobility,” said Ken Washington, Ford vice president, Research and Advanced Engineering. “Our goal was to become a member of the community. Today, we are actively working with more than 40 startups, and have developed a strong collaboration with many incubators, allowing us to accelerate development of technologies and services.” Since the new Ford Research and Innovation Center Palo Alto opened in January 2015, the facility has rapidly grown to be one of the largest automotive manufacturer research centers in the region. Today, it is home to more than 130 researchers, engineers and scientists, who are increasing Ford’s collaboration with the Silicon Valley ecosystem. Research and Innovation Center Palo Alto’s multi-disciplinary research and innovation facility is the newest of nearly a dozen of Ford’s global research, innovation, IT and engineering centers. The expanded Palo Alto campus opens in mid-2017.
-
2016 marks the 100th anniversary of BMW, and to celebrate, the German automaker unveiled a concept that tries to predict the future of motoring. The Vision Next 100 concept features a number of new technologies covering how a vehicle is built, the interaction between driver and vehicle, and autonomous driving.

Materials such as carbon fiber, plastic, and the residue from carbon fiber production make up the Vision Next 100's structure. Trademark BMW design cues such as the kidney grille and the Hofmeister kink in the C-pillar are apparent. BMW says the Vision Next 100's body is very aerodynamic, with a drag coefficient of 0.18.

Inside, the Vision Next 100 looks like something out of a sci-fi film, with a dashboard that can extend or retract a steering wheel depending on the mode the vehicle is in. BMW has also fitted a system called “Alive Geometry.” It uses 800 moving triangles in the instrument cluster and side panels that move as one to give signals to the driver. BMW describes the system as “a form of preconscious communication, where an intuitive signal predicts an imminent real-time event.”

Two drive modes are on offer. The first is Ease, which moves the dash and center console to create a lounge environment for passengers. In this mode, the vehicle drives itself and uses lighting in the grille, headlights, and taillights to let everyone know it is in self-driving mode. The second is Boost, which moves the dashboard forward and pops out a steering wheel for the driver. Alive Geometry can help the driver find the best line or warn about traffic problems.

“If, as a designer, you are able to imagine something, there’s a good chance it could one day become reality. So our objective with the BMW VISION NEXT 100 was to develop a future scenario that people would engage with. Technology is going to make significant advances, opening up fantastic new possibilities that will allow us to offer the driver even more assistance for an even more intense driving experience," said Adrian van Hooydonk, Head of BMW Group Design, in a statement. “My personal view is that technology should be as intuitive as possible to operate and experience so that future interactions between human, machine and surroundings become seamless. The BMW VISION NEXT 100 shows how we intend to shape this future.”

Source: BMW

Press release:

BMW VISION NEXT 100: Sheer Driving Pleasure of the future – What will it look like?

Trying to imagine how we will live and get around in the future is as challenging as it is fascinating. How will social, economic and living conditions change? And what will be the impact on our mobility? What exciting new possibilities will new technologies bring? And what effect will digitalisation and connectivity have on our automotive requirements? To mark its centenary year in 2016, the BMW Group is looking further ahead than usual with a series of Vision Vehicles designed to anticipate and respond to people’s future mobility needs.

Over the coming years, mobility will become increasingly diverse. In the not-too-distant future, most vehicles will probably be completely self-driving – people will get around in robots on wheels. So, given these developments, how will we justify the existence of vehicles by BMW, a brand for whom the individual and Sheer Driving Pleasure are the focus of everything? And how will BMW’s brand values translate into the future?
In developing the BMW VISION NEXT 100, the main objective was to create not an anonymous vehicle but one that is highly personalised and fully geared to meet the driver’s every need – because the very emotional connection between a BMW and its driver is something we want to retain. For the BMW VISION NEXT 100, the design team specifically took into account all the trends and technological developments that will be most relevant to BMW in the decades ahead. But they also took many of their cues from innovations and designs of the past. The key factor throughout, however, was something that has always been typical of the BMW brand: the desire to be uncompromising in its future focus on technologies and customer value.

Adrian van Hooydonk, Head of BMW Group Design: “If, as a designer, you are able to imagine something, there’s a good chance it could one day become reality. So our objective with the BMW VISION NEXT 100 was to develop a future scenario that people would engage with. Technology is going to make significant advances, opening up fantastic new possibilities that will allow us to offer the driver even more assistance for an even more intense driving experience.

“My personal view is that technology should be as intuitive as possible to operate and experience so that future interactions between human, machine and surroundings become seamless. The BMW VISION NEXT 100 shows how we intend to shape this future.”

Four proposals underpinning the BMW VISION NEXT 100:

A genuine BMW is always driver-focused. In recent months and years, the greatest current trend in the automotive industry has become so widespread that it’s no longer a question of ‘if’ but ‘when’: autonomous driving. The BMW Group also believes that BMW drivers will be able to let their cars do the work – but only when the driver wants. The BMW VISION NEXT 100 remains a genuine BMW, offering an intense experience of Sheer Driving Pleasure.

Artificial intelligence and intuitive technology become one. Moving into the future, vehicles will be fully connected, and digital technology will become so normal that it will permeate almost every area of our lives. Increasing digitalisation will lead to the physical and digital worlds merging more and more. Artificial intelligence will learn from us, anticipating many of our wishes and working away in the background to perform the jobs we delegate to it. The way humans and technologies interact will be transformed: screens and touchscreens will be replaced by more intuitive forms of human-machine communication and interaction. Better yet: technology will become more human.

New materials open up breathtaking opportunities. In the future, how will cars be manufactured? At some point, presses that punch out hundreds of thousands of steel parts may well become obsolete – the use of carbon may already be a first indication of the sea-change that is imminent in the world of automotive materials and production. Technologies such as rapid manufacturing and 4D printing will produce not components or objects but intelligent, networked materials and could soon replace conventional tools to open up unimagined possibilities in design and engineering.

Mobility will remain an emotional experience. Vehicles by BMW have never been purely utilitarian or merely a means of getting from one place to the next.
Far more, a BMW is about looking to the next bend in the road, feeling the power of the engine and enjoying the sense of speed; it’s about the sensory experience, the adrenaline rush or that intimate moment at which a journey begins, be it for a lone driver or one travelling with a close friend or loved one. Moving into the future, that’s not set to change – because the emotional experience of mobility is firmly fixed in our collective corporate memory. By keeping the driver firmly in the foreground, the BMW VISION NEXT 100 will heighten this emotional experience in an unprecedented way.

BMW VISION NEXT 100: A vehicle for future mobility.

- From driver to “Ultimate Driver” – through digital intelligence.
- “Alive Geometry” enables intuitive driver-vehicle interaction.
- “Boost” and “Ease” driving modes enable driver- or vehicle-controlled operation.
- “Companion”: The intelligent digital partner connects driver and car.
- Trademark BMW exterior.
- Materials of the future.

From driver to Ultimate Driver – through digital intelligence.

In the future, BMW drivers will still want to spend most of the time they are in their car at the wheel. In the BMW VISION NEXT 100, the driver will remain firmly in the focus, with constant connectivity, digital intelligence and state-of-the-art technologies available for support. But that’s not all: the BMW VISION NEXT 100 will turn the driver into the Ultimate Driver. So even though the world may well be changing, Sheer Driving Pleasure is here to stay – and will be more intense than ever before.

In designing the BMW VISION NEXT 100, the starting point was the interior. In the years ahead, the driver’s wellbeing will become increasingly important, and rather than merely feeling they are in a machine that drives itself, they should sense that they are sitting in one that was specifically designed for them. This idea gave rise to an architecture in which the cab seems particularly spacious compared with the overall size of the vehicle while retaining the typical exterior lines of a BMW. Despite its domed interior, the BMW VISION NEXT 100 retains the instantly recognisable athletic silhouette of a BMW saloon.

The design of the interior permits various modes of operation: Boost mode, in which the driver is at the controls, and Ease mode, in which the driver can sit back and let the vehicle take over. In Ease, the vehicle becomes a place of retreat with plenty of space, agreeable lighting and a comfortable atmosphere. In Boost, the driver takes over and benefits from the subtle and intuitive support offered by the vehicle. All the time, the vehicle is learning more and more about the person at the wheel, thanks to its sensory and digital intelligence, which the BMW Group calls the Companion. The Companion progressively learns to offer the right kind of support to transform the driver into the Ultimate Driver.

A very important element of the Vision Vehicle is another innovation known as Alive Geometry, the likes of which have never before been seen in a car. It consists of a kind of three-dimensional sculpture that works both inside and outside the vehicle.

Alive Geometry enables driver-vehicle interaction.

Alive Geometry consists of almost 800 moving triangles which are set into the instrument panel and into certain areas of the side panels. They work in three dimensions, communicating very directly with the driver through their movements, which are more like gestures than two-dimensional depictions on a display.
Even the slightest peripheral movement is perceptible to the driver. In combination with the Head-Up display, Alive Geometry uniquely fuses the analogue with the digital. The triangles work in much the same way as a flock of birds in controlled flight, their coordinated movements acting as signals that are easily comprehensible to those inside the car. Combined with the Head-Up display, they involve the driver in a form of preconscious communication, where an intuitive signal predicts an imminent real-time event.

Various approaches can already be seen today that appear to confirm the feasibility of this solution. Rapid prototyping and rapid manufacturing, for example, are gaining importance all the time and are likely to be commonplace 30 years from now. Although at present it remains difficult to imagine how hundreds of tiny triangles could be coordinated to make Alive Geometry work, in the years ahead, it will be possible, as today’s standard vehicle manufacturing methods are replaced. In the future it will become feasible to produce far more complex and flexible forms. This is why, in the context of the BMW VISION NEXT 100, the BMW Group refers to 4D printing, a process which adds a fourth level to components: the functional one. In the years ahead, printed parts manufactured in this way will directly integrate functions which today have to be designed and produced separately before being incorporated into the whole.

At the moment, the digital world is strongly linked to displays; the next step will be organic LEDs – in other words, displays that can freely change shape. However, the Vision Vehicle suggests there will at some point be no more displays at all. Instead, the entire windscreen will serve as a giant display, directly in front of the driver. In the future, the digital and physical worlds will merge considerably, as is also expressed through Alive Geometry, for example, in the way the analogue dashboard interacts with the digital Head-Up Display in the front windscreen.

Boost and Ease driving modes for driver- or vehicle-controlled operations.

In Boost and Ease mode alike, the elements and technologies of the vehicle make for the most intense or relaxed driving experience, depending on what is required. Transitioning between modes is impressive and perfectly orchestrated, and Alive Geometry remains relevant throughout. In Boost, when the driver is concentrating fully on the road, Alive Geometry highlights the ideal driving line or possible turning point and warns of oncoming vehicles. Rather than making the driver drive faster, this kind of support sets out to make them drive noticeably better. In addition, intuitive feedback has a more physical and immediate impact than a robotic voice or instructions on a screen. In Ease mode, on the other hand, Alive Geometry is more discreet in its movements, informing occupants about the road ahead and any acceleration and braking manoeuvres that are about to happen.

In Boost mode, the entire vehicle focuses on the driver, offering intelligent support to maximise the driving experience. The seat and steering wheel change position, and the centre console moves to become more strongly oriented toward the driver. As the journey proceeds, the driver can interact with the vehicle via gesture control. The contact analogue BMW Head-Up Display of the future uses the entire windscreen to communicate with the driver. In Boost mode, it focuses exclusively on what really matters to the driver: information such as the ideal line, turning point and speed.
In addition, full connectivity, intelligent sensors and permanent data exchange allow the Head-Up Display to generate a digital image of the vehicle’s surroundings. In foggy conditions, for example, this means the driver can benefit from information such as vehicles crossing ahead, before they actually come into sight. In addition, by learning more and more about the driver, the system continuously improves, concentrating on creating at all times the most intense and personal driving experience possible.

The transition to Ease mode brings about a complete change of interior ambience. The steering wheel and centre console retract and the headrests move to one side to create a relaxed and welcoming atmosphere. The seats and door panels merge to form a single unit, allowing the driver and passengers to sit at a slight angle. This makes it easier for them to face each other and sit in a more relaxed position for easier communication. Meanwhile, the Head-Up Display offers occupants personalised content along with the information and entertainment they desire. Depending on the driving mode, the focus of the vehicle changes, concentrating on essentials for the driver in Boost mode, and on the surroundings and atmosphere in Ease mode, highlighting the impressive landscapes or buildings of interest that the car is passing by, for instance.

Whether the vehicle is in Boost or Ease mode is also clearly apparent to other road users, as the trademark kidney grille, double headlights and L-shaped rear lights act as communication tools. Their different colours of light indicate which mode the vehicle is currently in.

Companion: The intelligent digital partner connects driver and car.

The Companion is symbolised by a small sculptural element which represents the driver-vehicle connection. Shaped like a large, cut gemstone, it is positioned in the centre of the dashboard, just beneath the windscreen, where it symbolises the intelligence, connectivity and availability of the BMW VISION NEXT 100. It also represents the constant exchange of data: the more it learns about the owner and their mobility habits, the smarter it becomes. At some stage it knows the driver well enough to automatically perform routine tasks for them and offer suitable advice when needed. Irrespective of the vehicle itself, constant learning makes the Companion increasingly valuable to its owner.

The Companion also plays an important role in driver-vehicle communications when the car transitions from Boost to Ease mode. While the driver concentrates on the road in Boost mode, the Companion remains flat in the dashboard. But when the BMW VISION NEXT 100 takes control in Ease, it rises up to create an interface with the windscreen. A signal light tells the driver that the car is ready for fully autonomous driving. For other road users, the Companion has a similar function, signalling through its own light as well as that of the vehicle that the car is operating in automated mode. In certain traffic situations, the Companion is in visual contact with other road users, helping pedestrians to cross the road by means of the green light gradient on the front of the vehicle.

Trademark BMW exterior.

The design of the BMW Vision Vehicle is characterised by a blend of coupé-type sportiness and the dynamic elegance of a sedan. At 4.90 meters long and 1.37 meters high, it has compact exterior dimensions. Inside, however, it has the dimensions of a luxury BMW sedan.
The large wheels are positioned at the outer edges of the body, giving the vehicle the dynamic stance that is a trademark of BMW. When it comes to aerodynamics, exterior Alive Geometry contributes to an outstanding effect: when the wheels swivel as the vehicle is steered, the bodywork keeps them covered as if it were a flexible skin, accommodating their various positions. The innovative design of the BMW VISION NEXT 100 gives it an extremely low drag coefficient of 0.18.

The exterior of the vehicle is copper in colour, designed to underscore the idea that BMW vehicles of the future should appear technical yet still have a warmth about them – as symbolised by the close links between the vehicle and its driver. This relationship begins as soon as the driver approaches the vehicle: intelligent sensor technologies automatically open its wing doors. To give the driver more space to enter and exit, the steering wheel is flush with the dashboard. Once seated, the full range of systems is activated by tapping on the BMW logo in the middle of the dashboard. The door closes, the steering wheel comes forward, and the driving experience begins.

Materials of the future.

The designers of the BMW VISION NEXT 100 primarily used fabrics made from recycled or renewable materials. The visible and non-visible carbon components, such as the side panels, are made from residues from normal carbon fibre production. In the future, the choice of materials will become even more important throughout the design and production process. With time, other new materials will also be added into the mix, allowing different vehicle shapes to emerge. To save resources and support more sustainable manufacturing, less use will be made of wood and leather while innovative materials and the consequent new possibilities in design and production gradually come to the fore. This approach is already being exemplified by the use of high-quality textiles and easily recyclable mono-materials and the elimination of leather in the interior of the BMW VISION NEXT 100.
- 4 comments · Tagged with: BMW, BMW Vision Next 100 (and 3 more)