Despite US dominance in so many different areas of technology, we’re sadly somewhat of a backwater when it comes to car headlamps. It’s been this way for many decades, a result of restrictive federal vehicle regulations that get updated rarely. The latest lights to try to work their way through red tape and onto the road are active-matrix LED lamps, which can shape their beams to avoid blinding oncoming drivers.

From the 1960s, Federal Motor Vehicle Safety Standards allowed for only sealed high- and low-beam headlamps, and as a result, automakers like Mercedes-Benz would sell cars with less capable lighting in North America than they offered to European customers.

A decade ago, this was still the case. In 2014, Audi tried unsuccessfully to bring its new laser high-beam technology to US roads. Developed in the racing crucible that is the 24 Hours of Le Mans, the laser lights illuminated much farther down the road than the high beams of the time, but the technology had to satisfy both the National Highway Traffic Safety Administration and the Food and Drug Administration, which has regulatory oversight of laser products.

The good news is that by 2019, laser high beams were finally available as an option on US roads, albeit only after their power was turned down to reduce their range.

NHTSA’s opposition to advanced lighting tech is not entirely misplaced. Obviously, being able to see far down the road at night is a good thing for a driver. On the other hand, being dazzled or blinded by the bright headlights of an approaching driver is categorically not a good thing. Nor is losing your night vision to the glare of a car (it’s always a pickup) behind you with too-bright lights that fill your mirrors.

This is where active-matrix LED high beams come in: they use clusters of individually controllable LED pixels. Think of them as a more advanced version of the “auto high beam” function found on many newer cars, which uses a car’s forward-looking sensors to decide when to dim the lights and when to leave the high beams on.

Here, the sensor data is used at a much finer grain. Instead of turning off the entire high beam, the car turns off only individual pixels, so the roadway is still illuminated, but a car a few hundred feet up the road won’t be.
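In rough terms, the per-pixel logic might look like the following toy sketch. The pixel count, field of view, and safety margin here are invented for illustration; real adaptive-beam systems use camera data and far more segments, and no automaker's actual implementation is shown.

```python
# Toy sketch of active-matrix ("adaptive driving beam") pixel masking.
# All numbers below are invented for illustration.

NUM_PIXELS = 16    # hypothetical LED pixels across one lamp
FOV_DEG = 40.0     # hypothetical horizontal spread the lamp covers

def pixel_mask(detected_bearings_deg, margin_deg=2.0):
    """Per-pixel brightness: 1.0 = full high beam, 0.0 = switched off."""
    seg = FOV_DEG / NUM_PIXELS
    mask = []
    for i in range(NUM_PIXELS):
        # Center angle of this pixel's beam segment (0 = straight ahead).
        center = -FOV_DEG / 2 + seg * (i + 0.5)
        # Darken the pixel if any detected vehicle falls inside this
        # segment, padded by a safety margin.
        blocked = any(abs(center - b) <= seg / 2 + margin_deg
                      for b in detected_bearings_deg)
        mask.append(0.0 if blocked else 1.0)
    return mask

# A car detected 5 degrees left of center darkens only a couple of the
# sixteen segments; the rest of the high beam stays lit.
mask = pixel_mask([-5.0])
```

With no detections, `pixel_mask([])` returns full brightness everywhere, which is just the ordinary high beam.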

Rather than design entirely new headlight clusters for the US, most OEMs’ solution was to offer the hardware here but disable the beam-shaping function—easy to do when it’s just software. But in 2022, NHTSA relented—nine years after Toyota first asked the regulator to reconsider its stance.

  • @[email protected]
    link
    fedilink
    English
    1
    edit-2
    9 hours ago

    Honestly, while it does add more moving parts into the equation, I wonder how realistic it is to just take the plunge and go with driving a car via display rather than directly looking through glass.

    Ever since we got cars with a pane of glass up front rather than goggles, that’s been kind of the standard way that motorists operated – look through a big sheet of glass.

    But that was also a system that was developed based on technology from around 1900.

    There have been displays that provide heads-up augmentation of stuff, projected on the windshield. But end of the day, maybe instead of augmenting vision, it’s time to just outright go with displays.

    I mean, it’s more moving parts, but you’ve got a lot of moving parts, computers and such, already involved in controlling your car now.

    Maybe shining really bright lights in front of a car as a way to see at night when traveling at high speed is getting obsolete.

    • As people get older, their eyes inevitably don’t adjust as quickly to darkness after being hit by a bright light. Can’t do much about that short of swapping out all headlights on older cars, even if you produce a new standard. It takes decades and decades to age that out. But this provides an immediate benefit to users of such displays.

    • We had a thread up the other day on [email protected] talking about how people driving tall vehicles that have poor visibility in front create some risks for kids. You can put cameras wherever you want.

    • If you’re looking out a window, you have some blind spots due to the roof support pillars. Those don’t need to exist with a display.

    • Other people in the car don’t need to obstruct one’s view.

    • We’ve gotten increasingly compelling systems that can process data from the outside world. Here’s a video clip from ENVG-B. That’s a US military system that can do things like edge detection and highlighting – I believe it aims to specifically detect and highlight humans – see into the infrared, and prevent people from being blinded by flashes (which is a particular problem for the military, with muzzle flash and explosions and such).

    • While there is some competition with self-driving cars (if the computer drives my car, then I don’t need to see to drive it), there’s a lot of overlap in the problems to solve. For self-driving cars, the car has to be able to generate a 3D model of the world around itself with sensors and such. That’s also the same data you’d need to be obtaining, processing, and providing to a human driver if you wanted to present a display of the world around oneself.

    • You can leverage sensor fusion, combining data from many different sensors. LIDAR, millimeter-wave radar, numerous cameras, hyperspectral imaging, light polarity-sensitive sensors.

    • While our eyes are pretty good, there are some environments that we run into, like dense fog or rapid transitions in brightness, that they just aren’t all that great at dealing with compared to the sensors that we have.

    • If you have the driver driving via display, a lot of constraints on where you place them in the vehicle go away. You don’t have the left-hand/right-hand split with vehicles, or even need to have the driver sitting at the front of the car. You can make a vehicle a lot shorter, and still potentially provide for pretty good visibility – and I understand that wanting more visibility is part of why people buy taller vehicles.

    • Windows aren’t great in terms of thermal insulation, especially single-pane car windows. If we didn’t have to have a lot of a car’s sides covered in glass, we wouldn’t need to spend as much energy on climate control.

    • Windows – though newer ones have improved on this – let in a fair amount of solar energy. Be nice to not have the “greenhouse effect” with a hot car that’s been parked in a parking lot in summer.

    • You can provide for more privacy if people can’t just see into cars. Some people tint vehicles for this reason, but tint comes with visibility drawbacks.

    • It’s not obvious that a parked car contains something valuable left in it, and you can make a car a lot more secure if someone can’t just smash in a window to get in.

    • Laminated car windshields are pretty safe and durable compared to their early forms, but they’re still a weak point in terms of safety; people have had rocks go through them and whack a driver.

    • One situation that I recall reading about that apparently has a nasty tendency to hit epileptics is driving down tree-shaded avenues; that can produce the regular flashing at about the right frequency to trigger seizures. If you’ve got a computer in the middle, it can filter that out.
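As a toy illustration of that last point (invented parameters, not any real system’s algorithm): a camera-to-display pipeline could rate-limit how fast global frame brightness changes, flattening strobe-like sun/shade alternation while still tracking slow lighting changes.

```python
# Toy sketch of flicker suppression in a camera-to-display pipeline.
# The step limit is invented for illustration.

def suppress_flicker(frame_brightness, max_step=0.02):
    """Rate-limit per-frame changes in global brightness (values in 0..1)."""
    out = []
    prev = frame_brightness[0]
    for b in frame_brightness:
        # Move toward the measured brightness, but no faster than
        # max_step per frame, so rapid light/dark alternation is flattened.
        delta = max(-max_step, min(max_step, b - prev))
        prev += delta
        out.append(prev)
    return out

# Sun/shade alternating every frame becomes a nearly steady level.
smoothed = suppress_flicker([1.0, 0.0] * 30)
```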

    One downside is that I’m not really happy with the present state of computer-integrated cars in terms of privacy. Like, it’s technically doable to build a system like this without privacy implications – a computer gathering data doesn’t need to mean that that data goes to anyone else – but the track record car manufacturers have here is not good. I don’t want to buy a car with sensors that can measure everything around me if what the manufacturer is going to do with that data is try to figure out how to make money from it.

    Another is that cars tend to have longer lives than do computers, as automotive technology hasn’t moved as quickly. As things stand today, you can’t really upgrade the computer in a car, much less sensors. A thirty-year-old car from 1994 might be perfectly driveable in 2024, but if we built a computer into it back in 1994, it’d have long-outdated electronics. My guess is that the kind of view of the world we could provide in 2054, thirty years from now, is gonna be a lot better than the view we can provide in 2024. I don’t really want to throw out a car in order to get a newer car computer and sensors. That’s not a fundamental problem – it’d be possible to make cars that have computer systems and sensors that can be replaced – but the economics would need to make sense.

    • TimeSquirrel

      What’s the backup for when the cams burn out or the screen goes kaput while you’re going 70mph? All those electronic systems in modern cars have backups. If power steering goes out, you can still turn the wheel with increased effort. If backup cam goes out, you have rear view mirrors. If the brakes fail, you have a manual cable-driven e-brake.

      I don’t trust man-made technology as far as I can throw it. I want redundancy.

      • @[email protected]
        link
        fedilink
        English
        16 hours ago

        You wouldn’t like the Cybertruck then: completely drive-by-wire steering, no physical connection.

        • TimeSquirrel

          Nope, never will set foot in one. If the Cybertruck and similar future vehicles were under strict regulations and maintenance schedules like what the FAA does for airliners, which are also fly by wire, then I might consider it. Right now all I have to go on is trusting Elon’s company and his engineers directly. And they seem to just be wanting to create a flashy cell phone on wheels full of gimmicks, not any sort of dependable vehicle.

    • dual_sport_dork 🐧🗡️

      It sounds like you’re describing Ng’s tank from Snow Crash.

      …I want to drive Ng’s tank from Snow Crash.

      • @[email protected]
        link
        fedilink
        English
        19 hours ago

        Ng drove his vehicle from a VR interface:

        Ng himself, or at least, Ng’s avatar, is a small, very dapper Vietnamese man in his fifties, hair plastered to his head, wearing military-style khakis. At the time Y.T. comes into his office, he is leaning forward in his chair, getting his shoulders rubbed by a geisha.

        But it is a very strange thing to do, for one reason: The geisha is just a picture on Ng’s goggles, and on Y.T.'s. And you can’t get a massage from a picture. So why bother?

        Ng sits back down and the geisha goes right back to it. Ng’s desk is a nice French antique with a row of small television monitors along the back edge, facing toward him. He spends most of his time watching the monitors, even when he is talking.

        Y.T. gets up and walks around behind his desk to look.

        Each of the little TV monitors is showing a different view out his van: windshield, left window, right window, rearview. Another one has an electronic map showing his position: inbound on the San Bernardino, not far away.

        “The van is under voice command,” he explains. “I removed the steering-wheel-and-pedal interface because I found verbal commands more convenient. This is why I will sometimes make unfamiliar sounds with my voice – I am controlling the vehicle’s systems.”

        Recognizing his van is easy enough. It is enormous. It is eight feet high and wider than it is high, which would have made it a wide load in the old days when they had laws. The construction is boxy and angular. It has been welded together out of the type of flat, dimpled steel plate usually used to make manhole lids and stair treads. The tires are huge, like tractor tires with a more subtle tread, and there are six of them: two axles in back and one in front. The engine is so big that, like an evil spaceship in a movie, Y.T. feels its rumbling in her ribs before she can see it; it is kicking out diesel exhaust through a pair of squat vertical red smokestacks that project from the roof, toward the rear. The windshield is a perfectly flat rectangle of glass about three by eight feet, smoked so black that Y.T. can’t make out an outline of anything inside. The snout of the van is festooned with every type of high-powered light known to science, like this guy hit a New South Africa franchise on a Saturday night and stole every light off every roll bar, and a grille has been constructed across the front, welded together out of rails torn out of an abandoned railroad somewhere. The grille alone probably weighs more than a small car.

        “I tried prostheses for a while – some of them are very good. But nothing is as good as a motorized wheelchair. And then I got to thinking, why do motorized wheelchairs always have to be tiny pathetic things that strain to go up a little teeny ramp? So I bought this – it is an airport firetruck from Germany – and converted it into my new motorized wheelchair.”

        “America is wonderful because you can get anything on a drive-through basis. Oil change, liquor, banking, car wash, funerals, anything you want – drive through! So this vehicle is much better than a tiny pathetic wheelchair. It is an extension of my body.”

        “When the geisha rubs your back?”

        Ng mumbles something and his pouch begins to throb and undulate around his body. “She is a daemon, of course. As for the massage, my body is suspended in an electrocontractive gel that massages me when I need it. I also have a Swedish girl and an African woman, but those daemons are not as well rendered.”

        “And the mint julep?”

        “Through a feeding tube. Nonalcoholic, ha ha.”

        That might be a bit of a jump past where we are today technologically, since he’s got a tactile-feedback rig for VR and the car driving under voice control. But, yeah, it’d be a pretty capable vehicle.