One thing I'd suggest for any hardware product: when writing up your bill of materials, provide links and show estimated costs. Sure, these will change, but having a rough idea of the costs is really helpful, especially for people browsing in from places like HN. It can make a big difference in whether someone decides to try it on their own or not. It's the ballpark figures that matter, not the specifics.
You did all that research, so write it down, if for no one but yourself! Providing links helps a lot because part names can be funky; a link lets people (including your future self) tell whether something is the same part or not. It's always noisy, but these things reduce noise. Importantly, they take no time while you're doing the project (you literally bought the parts, so you already have the link and the price). It saves you a lot of hassle, not just others. Document, because no one remembers anything after a few days or weeks. It takes 10 seconds to write it down and 30 minutes to do the thing all over again, so be lazy and document. I think this is one of the biggest lessons I learned when I started as an engineer: you save yourself so much time. You've just got to fight that dumb part of your head that tries to convince you it doesn't save time. (Same with documenting code[0])
Here. I did a quick "15 minute" look. May not be accurate
Lidar:
One of:
LD06: $80 https://www.aliexpress.us/item/3256803352905216.html
LD19: $70 https://www.amazon.com/DTOF-D300-Distance-Obstacle-Education/dp/B0B1V8D36H
STL27L: $160 https://www.dfrobot.com/product-2726.html
Camera and Lens: $60 https://www.amazon.com/Arducam-Raspberry-Camera-Distortion-Compatible/dp/B0B1MN721K
Raspberry Pi 4: $50
NEMA17 42-23 stepper: $10 https://www.amazon.com/SIMAX3D-Nema17-Stepper-Motor/dp/B0CQLFNSMJ
That gives us roughly $190-$280 before counting the power supply and buck converter.
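To make that arithmetic reproducible, here's a trivial sketch using the prices quoted above (a snapshot; they will drift):

```python
# Rough BOM sanity check; prices are the snapshot quoted above and will drift.
lidar_options = {"LD06": 80, "LD19": 70, "STL27L": 160}
common = {
    "camera_and_lens": 60,
    "raspberry_pi_4": 50,
    "nema17_stepper": 10,
}
common_total = sum(common.values())
totals = {name: common_total + price for name, price in lidar_options.items()}
low, high = min(totals.values()), max(totals.values())
print(f"${low}-${high} before power supply and buck converter")
```

Swapping in today's prices immediately updates the range, which is exactly why writing the BOM down pays off.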
[0] When I wrote the code only me and god understood what was going on. But as time marched on, now only god knows.
Learning projects like this are about to get a lot less accessible due to the extreme tariffs and the elimination of the de minimis exemption. Take that BOM and multiply it by 2x or 3x depending on the source and how many separate shipments arrive.
I can’t tell you how depressing it is to go from having access to cheap learning materials for introducing kids (and adults) to electronics, to watching them get taxed away in the name of improving US competitiveness or something. Total footgun.
I’ve been jokingly calling my electronics hobby of the past few years “getting my EE degree.”
Putting together PCBs, reading and replicating schematics, designing my own hardware. It’s been really fun and has paid dividends for my career (firmware). If the US is interested in bringing knowledge of manufacturing back this is a very very bad way to do it. How many undergrad projects are now impossible because the BOM has quadrupled? How many future mechanical/electrical engineers are not going to get into hardware because of this?
It’s all going to be gone. I spent roughly $1,200 last year having PCBs made and ordering sensors, etc.; with these executive orders that goes to $4,000.
There is a small part of me that wonders if it is by design, given how much this hobby exposes a person to the inner workings of the things they use daily, but then I bring myself back to reality. I sincerely doubt there is a real long-term plan at work here. At best it is all situational and reactive (tactical, maybe, but not strategic thinking at work).
Both can be true. There has long been an incentive to prevent people from building, repairing, and modifying things. It need not be malicious, with intent to keep people out of the field; it only needs to be naïve and selfish. Sony sued because people put Linux on their PS3. Remember, these are the same people who will sue white hat hackers for submitting vulnerability reports. It's not a long-term plan to destroy; it's an extremely short-sighted and naïve plan to build a wall and protect.
You'll find that naïve shortsightedness is often indistinguishable from malicious foresight when looked back upon. So remember how to stop it: think of the little things, the compounding effects. There are always costs and trade-offs being made; there's no free lunch. As the world gets more advanced it gets more complex, and the more complex it gets, the more those little subtle things matter.
Side hobby insanities aren't limited to 'EE degree' hobbies! Many companies got started as 'side ventures': Facebook, MS Windows, Dell, Nerf, and a multitude of others.
Use VR / simulations to cut down on Milton Hershey[1] cost issues. Simulations can allow for 'timing the tariffs'.
North American FPGAs may be a thing again! (once pi systems supersede lambda systems as the multi-FPGA board connection theory of choice :-)
Save on robotic automation costs via software scripts!
Perhaps 3D-printed circuit boards will now be a more realistic alternative to traditional outsourced board production? [2][3][4]
Perhaps there'll be some motivation/inspiration for creating/combining standard programming logic/language/3D 'threaded' sock-it programming by combining the (2021) mathematics of knitting[5] with a more modern / inexpensive / upgraded 'general computational device' derived from the Apollo guidance computer[6] concept. aka instead of wasm, weave-em.
Do have to get over 'a stitch in time saves nine', since most modern computation 'saves in 8'. Easy-to-unravel programs! Tight-knit spreadsheets!
But, computational e-ink might scale better using DBOS concepts[7].
Doesn't change the fact that the advice is still beneficial. At worst you still have a good history of the effect of these tariffs.
I'd call the tariffs the second death of hardware though. The first was when we killed all the parts stores. That was a slower death, coupled with the loss of right to repair. But we've been making big strides in that domain, so I hope we can undo that death. If we can also undo the dumb tariffs, then ironically we might have a chance to bring back hardware, which seems somewhat in line with what that party (claims to/pretends to) want.
It is not just a question of popularity. Post-9/11, skill sets around chemicals (or other scary-looking domains) became vilified among the general populace, and then that vilification was normalized by shows like 24 (which is a great show... it just sucks that people are unable to separate the world it presents from their own). It doesn't help with the advent of red flag laws, where neighbors can effectively call the cops on you if they see something that bothers them.
I'm not aware of the chemical hobby era. And as I watched it happen, the electronic hobby era died at the same time as the mechanical one. Most of the mechanical one was cars. It happened as it became harder to repair things. The barrier to entry increased, and the availability of parts decreased with decreasing demand. It is a self-reinforcing loop.
Mechanical: I was thinking more Meccano and home built steam engines - cars came later.
Chemical: I was thinking chemistry sets and the age when Chemistry was geeky cool (Du Pont, Uncle Tungsten).
My grand father was a structural engineer, my father studied chemical engineering, I studied electronic engineering (but got a job programming). So perhaps I was thinking more of the frontier of engineering shifting rather than technical hobbies.
It would be interesting to look at geeky hobby adverts in a magazine over time and see how the advert focus shifted.
I don't think those chemistry sets were ever widely popular. We see them and look back thinking how cool and dangerous, but every one I'm aware of was insanely expensive. Same with the atomic kit.
On the other hand, most mechanical things were relatively cheap. Cars were worked on since cars existed. People also worked on everything in their homes. There was the tradesman who takes a specific skill like being able to fix a washing machine, but a lot of these people had hobbies of building and making.
So we have a direct connection to the death of the repairman and death of the handyman. These were quite popular things even up through the 90's. You'll even see this in shows and movies.
Plus, you forgot the most popular hobby of them all: woodworking. Still alive, but nowhere near as popular as a few decades ago.
> Most of the mechanical one was cars. It happened as it became harder to repair things.
I don't think it has become harder to repair cars though. Most problems that need repair are the same old bushings, brakes, spindles, rust, bearings ...
I think it is some cultural change away from handy work in general.
Electronics have become harder to repair with smaller and more integrated circuits though.
A new Radio Shack dealer just opened in a rural area, with an emphasis on radio (it's near an off-roading park), and your comment reminded me what terrible timing this is for them. Such a cruel twist.
This is not correct. There absolutely is a de minimis exemption for tariffs. You always have to pay VAT, which you didn't always have to do before. You also don't have to pay that fee; you can order from a store that supports IOSS, and the big ones do nowadays. You can also choose to declare the package yourself, but you need to live close to a customs declaration office for that to be feasible. Or you can have it delivered through a different delivery company; they usually declare it for you.
Assuming this wasn’t sarcasm: For many hobbies the parts come from all over the world. You can’t expect hobbyists in every country to set up manufacturing for every part.
Even if they did, the raw materials have to come from other countries. The machines probably come from other countries. Setting up little factories all over the world isn’t efficient so prices would be extremely high. Parts might be cheaper importing from other countries even with extreme tariffs.
It’s all just a mess of bad policy. We lose out when governments restrict our ability to make small, simple purchases from other countries without heavy cost overhead.
Sweden as an example is 10 million people. The world is 8000 million, 800x larger. The affluent parts of the world are somewhere between 100-1000 million, so the market potential is 10-100x larger. If one is able to address the global market, the economy of scale around manufacturing will be heavily in favor of that. This is particularly true for hobbyist stuff, because it tends to be both (relatively) low volume and low margin. And also price sensitive, in that people might just adjust their hobbies based on what is affordable/not.
This is just demand side. Of course producing in Sweden will be more expensive than in China - for electromechanical things, Shenzhen is likely better on every single metric...
I wonder how that additional cost breaks down. Is it mostly cost of labor? Supply chain access? Environmental controls and compliance? Other overheads not present in China? Is economically viable production possible in the US?
For hobby parts? It’s not viable because you’d be setting up a manufacturing operation to serve a small number of people. The fixed costs would be so high you’d never get it back.
Contrast that with someone setting up an operation to serve the entire world, a market 1000 times larger than many localities.
I know what a buck converter is but I wonder why it's called 'buck converter'. A dictionary doesn't seem to have any words that mean roughly "reducing the amount of something". [0]
Similarly, Wikipedia doesn't elaborate on the etymology. [1]
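For what it's worth, the commonly given story is that the circuit "bucks" (opposes) the input voltage down, as opposed to a boost converter which steps it up. The ideal continuous-conduction relation is just V_out = D · V_in, where D is the switching duty cycle:

```python
# Ideal buck converter in continuous conduction mode: V_out = D * V_in,
# so the required duty cycle is D = V_out / V_in. (Idealized: ignores
# diode/switch drops and losses.)
def buck_duty_cycle(v_in, v_out):
    if not 0 < v_out <= v_in:
        raise ValueError("a buck can only step the voltage down")
    return v_out / v_in

# e.g. stepping a 12 V supply down to 5 V for a Raspberry Pi
d = buck_duty_cycle(12.0, 5.0)
print(f"duty cycle ~= {d:.1%}")
```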
What is the HN opinion on Tesla skipping lidar? Having spent some time with computer vision in university I think it's insane to skip it - sure stereo reconstruction is powerful but lighting conditions have such an impact on results that having some robust depth data feels like a no-brainer and skipping it feels like malignant neglect.
As someone who's done a lot of computer vision: it is insane to skip it. And it's sad, because what everyone missed from that viral Mark Rober video [0] was not the Looney Tunes wall hit but the fucking kid in the smoke. Add all the cameras and AI you want, you ain't changing the laws of physics: visible light doesn't penetrate smoke. But radar does. Every (traditional) engineer knows that safe systems have redundancy, and that they get that redundancy through differing modalities. Use cameras, but also use radar, lidar, and even millimeter wave. Using just cameras isn't just tying one hand behind your back, it's shooting yourself in the kneecap afterwards.
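To make the redundancy point concrete, here's a toy sketch (entirely hypothetical numbers, and nothing like a real fusion stack, which would use Kalman filters or learned models): each modality can drop out independently, and the system acts on the most conservative surviving range estimate.

```python
# Toy multi-modality fusion: each sensor reports (range_m, valid).
# A degraded sensor (camera in smoke, lidar in heavy rain) drops out,
# and braking decisions use the most conservative surviving estimate.
def fused_obstacle_range(readings):
    """Return the smallest valid range, or None if every sensor is out."""
    valid = [dist for dist, ok in readings.values() if ok]
    return min(valid) if valid else None

# Camera blinded by smoke; radar and lidar still see the obstacle.
readings = {
    "camera": (0.0, False),   # invalid: smoke blocks visible light
    "radar":  (18.0, True),
    "lidar":  (17.5, True),
}
print(fused_obstacle_range(readings))
```

With cameras alone, the same scenario returns nothing at all, which is the whole argument.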
The argument is that humans manage the task without lidar, and automation doesn't have to be perfect, it just has to be better than humans to be a net positive. It seems to me you might as well use lidar if it's cheap enough, but the argument that computer systems can outcompete human drivers without using lidar is at least reasonable, although not yet proven.
We could extend the argument more. Why build a self driving vehicle at all? Build a humanoid robot to drive the car for you! The argument that computer systems can outcompete human drivers, without using lidar, is at least reasonable, although not yet proven
(Just to make sure: this is a jab.)
It's a dumb argument on multiple counts. While it's a routine argument in software engineering, it's the kind of argument that in other engineering fields gets people fired, sued, and called to testify.
First off, humans don't do it "just by vision". Sure, we don't have lidar, but we have hearing, we have touch, we have tons of experience. We can create world models, for Christ's sake, and that means modeling physics. I'm sure you've seen papers that claim world models, but I'm an ML researcher who also has a physics degree and I'm not afraid to tell you that's bullshit. It's about as honest as Altman calling GPT PhD-level intelligence; a PhD has very little to do with the ability to recall information.
Second off, it doesn't matter much how humans do it. It matters what the car can do. Why limit yourself? There are tons of cars with radar and lidar. They're not much more expensive, and they can see an object in fog or poor lighting. They can do something humans can't! Why in the world would you decide not to do that? You can make an argument about price, but that argument changes as the thing becomes cheaper; at that point you're just adding danger for no reason. You can't argue that cameras alone will be safer. They categorically aren't. The physics is in your way.
But that is the argument Tesla made when they first said they were going to use only cameras. Everyone knew lidar would come down in price with scale, and that's why many other manufacturers went in that direction. It's mutually beneficial, so Tesla would have benefited from joining.
> can outcompete human drivers
Third, be careful with those claims. I'm more willing to believe third-party reports, like those from the NHTSA, than numbers straight from Tesla [0].
Humans suck at driving. Almost 2 million people are killed by vehicles on roads every year. While not all of this is due to limitations of vision, it's certainly a contributing factor. Tesla tries to handwave this away by saying that their systems don't text and drive, get distracted, etc., but this strikes me as more "AI will become superintelligent real soon now and render humans obsolete" when actual experience with AI is that it is dumb, forgetful, and prone to hallucinations.
Is Tesla getting into legal mess if they need to add sensors to make self driving work when they already sold that feature to car owners? Would that imply that they need to retrofit already sold cars with upgraded sensor packages?
Yes, and this is already turning out to be a problem for them. They've acknowledged that HW3 is not sufficient, and will be on the hook for those who bought the FSD package with those cars.
That isn't the end of the world, but it'd turn into a much bigger problem if they also had to add additional sensors and body modifications to support those sensors.
For a long time, I honestly thought their solution might be something like this. Either that or they could ship more advanced hardware for 3-5 years before updating the software, so that most vehicles would have the new hardware.
IIRC Musk specifically said that the cars had sufficient hardware for FSD mode and advertised them as such. Tesla would have to retrofit the LIDAR sensor or pay money back to their customers if they rolled out FSD with LIDAR.
> visible light doesn't penetrate smoke. But radar does.
This is the key insight that frustrates me to no end about the whole thing. We have sensors that are better than human eyes, but we should limit ourselves to that because what? I don't use a calculator because it's slightly better at math, I use a calculator because it's fucking awesome at multiplying numbers in a way that my human brain can't remotely compete with. I want to be able to see where Elon is coming from but lately I can't.
It strikes me as foolhardy that the U.S. allows self-driving vehicles to use the public roads without having to pass safety tests. I'd blame the lack of government control of public spaces as well as Tesla's engineering.
Musk is allowed to test in production with 1 ton metal machines racing at 100 km/h without it entailing legal responsibility. Amazing the influence that a few million believers on social media can buy you these days
My opinion is that skipping lidar is nonsense. Check Mark Rober's video, published one month ago, "Can You Fool A Self Driving Car?" where he compares a Tesla with other Lidar-equipped cars: https://www.youtube.com/watch?v=IQJL3htsDyQ
There will come a time in the very near future (read: five years) when people will not buy a vehicle (car, bike, etc.) without lidar, as its price becomes as insignificant as a reversing camera's and it becomes commonplace.
Personally, I'll no longer buy any vehicle without camera-assisted parking, and apparently many people agree this is an important feature, including Marques Brownlee [1].
[1] Reviewing my First Car: Toyota Camry Hybrid! [video]:
I can only speak for myself, but I work on this stuff in this industry: Tesla’s choice is asinine at this point. It’s one thing to claim cameras only and find that won’t work and pivot, but they are so dug in that they can’t admit they were wrong and won’t do so. So it’s asinine now.
>It’s one thing to claim cameras only and find that won’t work and pivot, but they are so dug in that they can’t admit they were wrong and won’t do so. So it’s asinine now.
Didn't they start with lidar or radar or similar and then go back to only using vision based technologies?
Humans also have a vestibular system[1] and proprioception[2], which I think are very important in making judgments about safe following distance, car handling, and road conditions, in particular in adverse conditions like strong side winds and slippery or icy roads, which may not always be visually obvious.
While some of this can be handled by an IMU, I think humans still have a strong advantage in fusing their various sensory inputs, thanks to millions of years of evolution.
Well I haven't used taste nor smell in my driving yet. Touch only as far as vibration and steering wheel torque (both not difficult to sense with electronics).
A similar situation to Jobs reciting research on how optimal the one button mouse is. A thought bubble.
Why, we will perhaps never know. Likely they were early and lidar was deemed too expensive back then, or they didn't find a supplier they could work with. Now there's too much prestige in it and they can never back down, since that would mean admitting a mistake.
It would be one thing if it was a one time event but then they repeated that playbook with the lack of a rain sensor.
I think it was a valid decision that turned out to be incorrect and is staying put as a result of stubbornness. People really like criticizing decisions in hindsight, especially here, where the armchair engineer with the benefit of hindsight is all too common.
People have been criticizing this decision from the get-go. It may have spread from engineers to the general public, but let's be honest, the latter doesn't matter in a topic like this anyway.
It was a valid experiment and pushed computer vision but it clearly failed a long time ago. The fact that Teslas are not only still sold without lidar but “autopilot” is pushed as safe is disgusting.
People will die (and have already died) horrifically because of this decision. It’s morally bankrupt.
Radar technology offers a range of applications, including the ability to detect objects around corners, behind obstacles such as brick walls, and even penetrate human bodies at specific frequencies. However, when multiple sensors yield similar results, it becomes challenging and costly to discriminate which results are valid.
Operating radar at a specific frequency, such as 2.45 GHz (a microwave frequency often used due to its affordability), can be ineffective in environments rich in water droplets (e.g., rain), as these can dominate the radar signals. Higher frequencies enable the detection of smaller water droplets, but switching between frequencies can be expensive. Additionally, varying the radar's detection range to identify objects of different sizes complicates the calculations, involving factors such as minimum and maximum range, power, and time on target.
Cameras typically detect moving objects by comparing successive images. In contrast, radar can identify both stationary and moving objects and determine their direction relative to the sensor by emitting a frequency and analyzing the reflected pulses. Lidar, on the other hand, uses light to measure the distance to objects in its path, employing a photoreceptor to capture the reflected light.
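One thing radar adds almost for free is radial velocity via the Doppler shift; the textbook relation is v = f_d · c / (2 f_c), with the factor of 2 because the wave travels to the target and back. A quick sketch (the example frequencies are illustrative, not any particular sensor's):

```python
# Radial velocity from a radar Doppler shift: v = f_d * c / (2 * f_c).
# The factor of 2 accounts for the round trip to the target and back.
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz, carrier_hz):
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# e.g. a 77 GHz automotive radar seeing a ~5.14 kHz Doppler shift
v = radial_velocity(5_140.0, 77e9)
print(f"{v:.1f} m/s closing speed")
```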
I was responding to Tesla skipping lidar, but apparently it's a sore topic. As pointed out by someone else, Tesla used to use radar and ultrasonic sensors but stopped.
“Until this month's change, all Model S and Model X EVs intended for North America were equipped with radar sensors, but the company has been building new Model 3 and Model Y vehicles without any front radar sensors since May 2021. That's when Tesla announced a change for those models away from radar to Tesla Vision” [0]
I don’t think many people understand how these systems work but we’re not on a radar or engineering forum.
This isn't too expensive for Tesla, it's just nowhere near the level needed for an AV. Automotive lidars are 10-20 scans/second, rated for dust/rain/etc, and need a range of at least 50 meters, but 100-200 is more ideal. Not a fan of Tesla's approach, but I wanted to clarify that it's not like they can just use a lidar like this and call it a day. The specs are completely different and that really drives up cost!
> With a horizontal field of view of 30° or more and 576 ranging points (24 x 24), the sensor supports a frame rate of 30 fps, with a reduced 15 fps mode for maximum distance operation.
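A quick back-of-envelope on why those specs don't translate to automotive use (the automotive figures below are rough assumptions, not any particular vendor's specs):

```python
# Point-rate comparison: the quoted hobby sensor vs. a spinning
# automotive lidar (automotive parameters are rough assumptions).
hobby_points_per_s = 576 * 30           # 24x24 ranging points at 30 fps
auto_points_per_s = 128 * 1024 * 10     # assume 128 channels, 1024 azimuth steps, 10 Hz

print(hobby_points_per_s, auto_points_per_s)
print(f"ratio ~ {auto_points_per_s / hobby_points_per_s:.0f}x")
```

And that's before the range requirement (50-200 m vs. 12 m) and the environmental rating, which drive the optics and detector cost.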
> requirements for electronics in a car are pretty extreme
+ the salaries of everyone working on that stuff, not just assembly but also writing the code to support it
Not that I disagree, either: at the volumes that a modest car company puts out, I'd assume it's easily worth the, say, 3% cost premium on the car's total price to have something that can actually see things you don't see and thus makes a safer system. It might even reduce costs by having lower requirements for the vision hardware and software, but that's not something I can know. There's a lot of unknowns here that I think mean we can't really do a good comparison indeed
That hasn’t stopped Tesla before. They have a track record of treating automotive-grade quality standards as optional when doing electronics sourcing[1].
As the article notes, Tesla conveniently “fixed” their thermal and durability issues by inventing a feature called cabin overheat protection and marketing it as protecting people/animals from overheating, not the non-automotive-spec electronics in the cabin.
If you can’t bring auto quality electronics to the car, just change the car so it avoids standard auto thermal conditions ¯\_(ツ)_/¯
They wanted to sell “self driving ready” packages 10 years ago, when LiDAR actually was expensive. So at the time, they had to make big deal about LiDAR being unnecessary.
Not by that much: current-generation hardware for cars is $500-700, and some OEMs expect to bring the price below $200 with the next generation of equipment. Now that BYD puts self-driving in almost every car, it will supercharge adoption, and lidar prices might drop even faster with economies of scale.
My (tenuous) understanding is that the challenge with lidar isn't necessarily the cost of the sensor(s) but the bandwidth and compute required to meaningfully process the point cloud the sensors produce, at a rate/latency acceptable for driving. So the sensors themselves can be a few hundred bucks, but which other parts of the system also need to be more expensive?
That seems very unlikely to me. Automotive applications are already doing things like depth reconstruction based on multiple camera angles and ML inference in real time. Why should processing a depth point cloud be significantly more difficult than those things?
The basis for my understanding is a convo with a Google engineer who was working on self-driving stuff around 10-15 years ago -- not sure exactly when, and things have probably changed since then.
At the time they used just a single roof-mounted lidar unit. I remember him saying the one they were using produced point cloud data on the order of Tbps, and they needed custom hardware to process it. So I guess the point cloud data isn't necessarily harder to process than video, but if the sensor's angular resolution and sample rate are high enough, it's just the volume of data that makes it challenging.
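An order-of-magnitude sketch of where such numbers can come from (all parameters below are assumptions): a finished point cloud is fairly modest, but digitizing raw return waveforms across many channels gets enormous fast, which would fit the "custom hardware" story.

```python
# Rough data-rate estimates; every parameter here is an assumption,
# not a spec of any real sensor.
def rate_bps(samples_per_s, bits_per_sample):
    return samples_per_s * bits_per_sample

# Finished point cloud: ~1.3M points/s at ~128 bits/point (xyz + intensity + time)
cloud = rate_bps(1.3e6, 128)
# Raw return waveforms: 128 channels digitized at 1 GS/s, 8 bits each
raw = rate_bps(128 * 1e9, 8)

print(f"point cloud ~{cloud / 1e6:.0f} Mb/s, raw waveforms ~{raw / 1e12:.1f} Tb/s")
```

So whether "Tbps" is plausible depends entirely on how early in the pipeline you count the data.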
Maybe. At that time, sure, but 10-15 years later we have graphics cards doing actual ray tracing, and lidar computation is way less complex by comparison. Anyway, the $200 is for the whole system, not just the sensors, so that would include signal processing.
Makes sense. Maybe doing self-driving well just requires ridiculously high bandwidth regardless of the data source. Relatedly, the human visual system consumes a surprisingly large quantity of resources, from metabolism to brain real estate.
This doesn’t seem to stop Tesla’s competition in self-driving cars from implementing it, and succeeding far more in safety and functionality while doing so.
Valuation of a statistical life is $5-10M, depending on who you ask[0].
So it’s too much to afford, or at least not singularly justifiable, unless more than 1 out of every 2000 cars kills someone in a way that would be prevented by LIDAR.
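The breakeven in that argument is just the VSL divided by the per-car sensor cost (the sensor prices below are placeholders, not quotes):

```python
# Crude cost-benefit breakeven: a per-car lidar "pays for itself" if it
# prevents one statistical death per (VSL / lidar_cost) cars.
def breakeven_cars(vsl_usd, lidar_cost_usd):
    return vsl_usd / lidar_cost_usd

print(breakeven_cars(5e6, 2500))    # $5M VSL, $2,500 sensor
print(breakeven_cars(10e6, 500))    # $10M VSL, $500 sensor
```

Note how sensitive the conclusion is to the sensor price: if lidar really falls to a few hundred dollars, the bar drops by an order of magnitude.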
What a weird argument by Karpathy. He has a degree in physics; how does this dude not know that radar can see things that cameras can't? The argument doesn't make any sense. That there are supply chains and things break, and this makes it unsafe? Well, that's true for every single bolt, every nut, every little thing. I understand the drive for simplicity, but you can't just throw fancy words like entropy in there while ignoring the literal physics that says camera + radar leaves you with less entropy than camera without radar. There is literally more (unique!) information available to you!
Do we really need LiDAR in a Tesla? I own a Chevy Trax and it has LKAS and ADAS. Not even using LiDAR just sensor fusion with camera and radar. It’s a cheap car too. It’s car assisted driving.
I have driven a Tesla once but not with the added feature.
This isn't really something you'd ship in a car though. It's cool that we have such a rich ecosystem of devices that this can be made "off-the-shelf" - but for production use in a car? Not really practical.
This is the most ungrateful comment I've read today, harping away about how 'it should have been done'.
Well you fucking do it then.
I know that my time is so short (because I have a family) that if I can even do a project then I'm almost certainly not going to document it because getting it done will be enough of a stretch for me, and if I need to come back and re-do it again, I am probably not going to even bother. Not all of us live in mom's basement and have the luxury of extra time.
It was a general suggestion for everyone doing hardware projects and OP did a lookup and provided the additional info / links, which sparked further discussions.
My argument was to do this BECAUSE your time is short. Helping others is the side benefit. There are completely selfish reasons to document. It's more important to document when your time is short and more when it gets interrupted. Unless you have a perfect memory, write it down. Not all of us have the luxury to continually work on a project to maintain context continuously.
Max range 12 meters. That's when it seems to start to get expensive. The light source, filters, and sensors all have to get better.
Good enough for most small robots. Maybe good enough for the minor sensors on self-driving cars, the ones that cover the vehicle perimeter so kids and dogs are reliably sensed. The big long-range LIDAR up top is still hard.
I'd like to know where this price jump really comes from; Google doesn't help me. My first guess is that laser safety becomes an active control process at this point: the laser scanner mirror needs to keep moving so it can't deposit a damaging amount of energy onto a human retina. So you need a safety-critical control system to constantly monitor mirror speed and position and shut down the laser when it becomes too slow. How wrong am I?
More output power, larger optics, more sensitive detectors, more rejection of unwanted light, more pixels, larger rotating machinery, active stabilization... And the big units are low volume.
Here's a top of car LIDAR you can buy for about US$27,000.[1] 128 pixels high sensor, spinning. This is roughly comparable to Waymo's sensor.
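A rough feel for the range part of that list (idealized; real link budgets have many more terms): returned power for a diffuse target that fills the beam falls off roughly as 1/R², and as 1/R⁴ for a small target, so pushing from a 12 m hobby unit toward 100-200 m is brutal.

```python
# Relative lidar link budget between a short-range and long-range target.
# Exponent 2: diffuse target filling the beam; exponent 4: small target.
# This is a toy scaling argument, not a full radiometric model.
def relative_link_budget(r_near_m, r_far_m, exponent=2):
    return (r_far_m / r_near_m) ** exponent

print(f"{relative_link_budget(12, 200):.0f}x harder (beam-filling target)")
print(f"{relative_link_budget(12, 200, exponent=4):.0f}x harder (small target)")
```

Some combination of more output power, bigger optics, and more sensitive detectors has to absorb that factor, which is where the money goes.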
It may be in the project as I just scanned through it (but I will read through it properly soon), but do you have any data on accuracy? Say, over 10 m (or less, if this lidar doesn't work at that distance)?
I'm familiar with the FARO scanners which have a different type of mechanism. Their accuracy is good enough for building things.
I've discovered there's several markets for scanners… among those are people who need accuracy and people who are creating content for media like games.
Thank you so much for sharing this project. It's truly unbelievable.
That sounds like it could be in the range of a digital readout (DRO) as used on milling machines and lathes. My Mechanics had a video just the other day about replacing one, and the old one had 5-micron accuracy.
Does the interval you're measuring move around much?
Can the measurement system touch or be affixed to it?
Sounds like a pair of nice calipers might work. So depending on your precision needs, you might get away with the same approach: a sliding grid of capacitive cells that passes over the measurement cells. A microcontroller measures them as it slides along, with atan2() for the final result. The meter-only part of this is called a DRO (digital readout).
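A minimal sketch of that atan2() step (an assumed signal model; real caliper scales also need calibration and a coarse counter): the sliding grid produces two signals roughly 90° out of phase, and atan2 recovers the position within one grating pitch.

```python
import math

# Capacitive-scale position decoding, caliper/DRO style: two quadrature
# signals (sin-like and cos-like) give the phase within one grating pitch.
def position_within_pitch(sin_signal, cos_signal, pitch_mm):
    phase = math.atan2(sin_signal, cos_signal)          # -pi .. pi
    return (phase % (2 * math.pi)) / (2 * math.pi) * pitch_mm

# A quarter of the way through an assumed 5.08 mm pitch -> phase pi/2
p = position_within_pitch(1.0, 0.0, 5.08)
print(f"{p:.2f} mm within the current pitch")
```

A real DRO additionally counts how many whole pitches have passed, since atan2 only resolves position modulo one pitch.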
Thanks for sharing this video, I am also interested in this exact thing. However from my understanding with an approach like this you are limited by the size of the image sensor, meaning if my surface has a bump that is larger than the size of the image sensor it would not get measured. Any idea on how to make something like this work if the goal was to measure slightly larger topographical changes at a less granular resolution like in the 100mm range?
Yea I have some ideas, but I haven't found an easy way to implement it yet.
The term of art that I'm exploring is called "Holographic Interferometry".
Sibling poster gave you a link to regular interferometry.
But basically, if you split the laser beam, one goes straight into the camera sensor, and the other off your object, you can do some pretty amazing things. Depending on a lot of little details (The devil is hiding here).
I found 3Blue1Brown's explanation to be the best, but less "ready to use".
Very very cool. I think what I am looking to do is much simpler: trying to create precise mappings of golf greens. Starting out just using a simple ToF sensor and running it over the surface.
Note those 532nm green DPSS lasers are repeatable within +-1nm across their normal operating temperature. Adding a 20nm wide OD6 narrow band-pass filter to a $5 5mW DPSS laser module is the cheapest precision money can buy these days.
Really depends what one can get away with in the mechanism being built. Note, many machines will fall under export restriction, and as a company people have to decide whether that encumbrance is worth the hassle.
Really cool man, will need to spend some time to fully wrap my head around this. Wonder if this is going to provide maybe too much granularity, as I am measuring slope variations of say 2cm/100cm, like a golf green. Trying out just a basic ToF sensor as of now.
We had a similar issue at one point, and had to build something custom that cost way more than I'd like to admit. Thus, I would recommend just looking at DRO kits for CNC milling machines.
If your project is not budget constrained, then there are complete closed-loop stage solutions around:
The sketchfab examples are fantastic, to be able to move around in a 3D space, like it's some kind of scifi simulation.
The mouse controls are confusing the heck out of me. It shows a 'grab' icon, but nothing about it grabs: the movement direction is the opposite of what you'd expect, and it feels completely unnatural.
I've been toying with photogrammetry a little bit lately, specifically for scanning indoor rooms and spaces. So far I'm finding metashape the most suitable for it, but some of the precision isn't great (but I'm still improving my technique). I mostly want to convert the interior of one real building into a digital model for preservation and analysis. I've briefly considered LIDAR, but put it in the too hard/expensive bucket. This project seems to challenge that assumption.
What does the software post-processing look like for this? Can I get a point cloud that I can then merge with other data (like DSLR photographs for texturing)?
I see in their second image[1] some of the wall is not scanned as it was blocked by a hanging lamp, and possibly the LIDAR could not see over the top of the couch either. Can I merge two (or more) point clouds to see around objects and corners? Will software be able to self-align common walls/points to identify it's in the same physical room, or will that require some jiggery-pokery? Is there a LIDAR equivalent of coded targets or ARTags[0]? Would this scale to multiple rooms?
Is this even worth considering, or will it be more hassle than it's worth compared to well-done photogrammetry?
(Apologies for the peak-of-mount-stupid questions, I don't know what I don't know)
Shameless plug, but if you own an iPhone pro or iPad Pro (which have Lidar integrated), you should give Dot3D a try. It does everything you describe and we made it very easy to use.
Thank you for the reply. Unfortunately, I don't own an iPhone - maybe I can borrow one, though. Any limitations of the app or practical advice you might want to share?
Hi! Thanks for sharing this amazing work. I’m curious about the scalability and performance of PiLiDAR when deployed on large-scale outdoor datasets. Have you benchmarked it on datasets like SemanticKITTI or nuScenes? If so, could you share any insights on runtime, memory usage, and how well it generalizes beyond the indoor scenes used in your paper?
I think you (or I — please correct me if that's the case) misunderstood something here: this is a DIY lidar scanner for data acquisition. Those datasets are mostly created using RGB cameras, and the point clouds are later created in a post-processing step.
So it's not a model for processing data but rather a hardware hack for having a real lidar - as in real depth data.
Oh hey! This is exactly what I was looking for just a couple weeks ago! I've had parts to prototype something roughly equivalent to this sitting in my cart on Amazon for a couple weeks now, but I've been very uncertain on my choice of actual lidar scanner.
I'll have to look into this as a starting point when I get back from Easter vacation.
For home improvement projects, this could be quite useful for generating a point cloud map of places that are hard to get to. Like, I have drywall installations I would love to get behind to check how things look; this would be great for that.
This is a very legit and good idea. A simple stud-finder like tool to map out behind walls would be incredibly useful for folks who run cabling or whatnot.
GY-521 in particular and MPU6050 in general make quite poor IMUs. Why do you use them? And what for in this particular case?
What do they do in this set up?
Yaw drift is my problem, so I tried a bunch of IMUs.
Ones built around BNO055 seem to be alright and they are not that much more expensive.
I ended up using Adafruit's.
I think it's unlikely because both lidars have to be pointing at exactly the same place at exactly the same time and using the same frequency. Not impossible, but probably not a big deal.
It's not obvious what the heck this is without reading into it. A full 4pi steradian scanner? a 360 degree 1 channel LIDAR? A fisheye camera plus some single channel LIDAR plus monocular depth estimation networks to cover everything not in the plane of the lidar?
It would be great to clarify what it is in the first sentence.
I believe it's a 360deg planar lidar mounted on a vertical plane, with a motor to rotate it around and slowly cover a full 4pi sphere. There's also a fisheye camera integrated in. This is a pretty common setup for scanning stationary spaces (usually tripod mounted)
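If that reading of the mechanism is right, each (range, beam angle, yaw) triple maps to a 3D point with two rotations. A sketch in Python — the angle conventions here are my assumption, not taken from the project:

```python
import math

def scan_point(r: float, theta: float, phi: float) -> tuple:
    """Map one reading to XYZ.
    r     -- measured range from the spinning lidar
    theta -- beam angle within the lidar's vertical scan plane (0 = horizontal)
    phi   -- rotation of that plane about the vertical axis (the slow motor)
    """
    u = r * math.cos(theta)   # horizontal offset inside the scan plane
    z = r * math.sin(theta)   # height relative to the lidar
    return (u * math.cos(phi), u * math.sin(phi), z)

# A reading straight ahead with no yaw lands on the x-axis:
print(scan_point(2.0, 0.0, 0.0))  # -> (2.0, 0.0, 0.0)
```

Sweeping theta through the lidar's own 360° plane while phi covers half a turn reaches the full sphere, which is why these tripod scanners only need a slow single-axis motor.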
It's impressive that the cost of usable LIDAR tech is well within the reach of personal projects now. The sensors used on the first self-driving cars (from companies like SICK, etc.) likely perform much better but the price point of multiple k$ is not really viable for experimentation at home.
Not to make everything political, but I wonder how the US tariffs will affect electronics-adjacent hobbies. Anecdotally, the flashlight community on Reddit has been panicking a little about this.
I'm sure most electronic hobby projects are going to be financially out of reach for many people, at least for a while. Many people who run businesses around small homebrew projects are struggling, too [1]. But it can be extremely hard to tell what might happen with a POTUS who seems to change his mind on tariffs on a whim, with zero apparent thought process, no prior notice of when they're going to be implemented or removed, and then implemented again at 500% or whatever.
I know Hong Kong Post also recently blocked outbound packages to the US entirely [2], so I don't know how that's impacting shipments of tech like this, but I would be curious to know.
Never mind hobbyists - I work in electronics R&D and my two favorite suppliers are US based even though I am not. Anxious to see how this plays out and that's not even considering our production departments.
> Not to make everything political... [proceeds to make a political statement]
For what it's worth, this type of Lidar scanner was possible to make well over a decade ago with ROS1, a Phidgets IMU, a webcam, and a lidar pulled out of a Neato vacuum (the cheapest option at the time). This would be around the difficulty of a course project for an undergraduate robotics class and could be done with less than 200 USD of salvaged parts (not including the computer). Hugin was also around over a decade ago.
I would not consider asking a question about the impact of current events on a market segment relevant to the discussion topic to be political. The disclaimer is presumably to encourage respondents not to drag things in an off topic direction. Ironic, considering the outcome.
This seems to be using classic formula -> get trivial, ready made component, design 3D printed enclosure and hook it up to Raspberry Pi. Instant Hacker News homepage.
So, where's your hardware project? Have you ever made one? I think you're underestimating the amount of time and effort that went into the linked project.
Not to make everything political... [proceeds to make a political statement]
Being all polite and non-political and shit is what brought us to this pass.
Never lose an opportunity to make the people who voted for the current state of affairs feel isolated, rejected, guilty, and generally bad. Being nice to them doesn't work.
Please, I don't want to come on to HN to see politics injected into everything. Stay on reddit for that.
I logged in to make a comment regarding something within my area of expertise: the technology present in the parent link and how this technology has been accessible to hobbyists for over 10 years.
>I don't want to come on to HN to see politics injected into everything
If it's political to wonder how tariffs impact the cost of the project we're discussing, then everything is political, and it's pointless to complain about politics being "injected into everything."
>You’re feeding into the confirmation bias I already have about how the opposition thinks
It's wild that you acknowledge your cognitive bias and then blame others for it instead of working on it. If I wrote something like that, I hope I would have the wherewithal to notice that something is seriously wrong with my thinking.
Yes the opposition thinks evil is evil. The opposition also thinks water is wet. Check back here tomorrow for more obvious things rational people think.
The opposition reductively believes this is an existential battle between “good and evil”, they’re the “good”, and that’s a position from which one can justify almost anything to eradicate “evil”.
Well, Trump is the one that almost always frames things in a very binary way. If someone contradicts him, it is "fake news". His opposition is typically much less so, and much more rational and thoughtful.
Even many in the opposition agree with many of his goals (controlling immigration, protecting American industries, shrinking the government).
You can always know, if you want to, by actually engaging in constructive dialog. Which probably isn’t going to happen in this thread because it’s ostensibly about a raspberry pi LiDAR scanner, and thus neither really the time nor place.
The MAGA crowd is not even remotely interested in 'constructive dialog' and is so far down the hole of drinking the kool-aide, constructive dialog with them will likely never be possible.
You cannot have constructive dialog about astronomy with someone who thinks the sky is made of green and purple polkadots because that's what someone told them, and dismiss all evidence to the contrary as a massive conspiracy.
They don't even believe in democracy or constitutional rights - at least, for anyone but them.
It's true, a Hokuyo or a Sick that sold for several thousands a decade ago is laughably bad compared to something under $100 from Shenzhen these days. When there's a need there's a way, I guess.
I hope they decide to develop some disruptive stereo/structured light/tof cameras eventually too, those are still mostly overpriced and kinda crap overall.
Short term there's some suffering but while hobbyists are definitely more price sensitive, they are also the most flexible ones. In production you don't just need one piece, you need a steady supply and any change of components affects the whole product.
How China and the US interact will determine the longer-term future of that economic relationship, but many companies are already adjusting because the future is currently uncertain. With the free trade agreement with the EU and more producers moving to the US, I think it's been a good disruption, even if I'm now also scrambling to find alternative PCB manufacturers.
How many will follow through with these announcements? During Trump's first term, announcing huge projects in the US and then not following through was a common tactic for companies dealing with Trump. Foxconn, for example, announced a new $10 billion factory in Wisconsin. They made some initial investments and stopped when people stopped paying attention. Instead of the promised 13,000, they now employ about 1,000 people there.
And what about all the companies that will have gone out of business by then? This mainly affects small companies, which are exactly the companies you need for a healthy economy. In some cases, they have shipments already paid for that they can't accept because they don't have the liquid assets to pay the unexpected tariffs, so these companies are now at risk of going out of business completely unnecessarily.
It never makes sense to use tariffs for economic reasons. It just does not work. Tariffs can make sense for strategic reasons if you're willing to take an economic hit to lower dependence on other countries for critical industries or technologies. However, the idea that taxes are ever "a good disruption" for the economy does not bear out.
>It never makes sense to use tariffs for economic reasons. It just does not work.
This week, two US companies I bought some products from (I'm in Europe) sent me an email explaining that they have to raise their prices due to tariffs, as they need to import from China for now.
Guess who will be faster: these companies finding an alternative supplier in the US that matches China's quality-price ratio, or me finding an alternative supplier in China? They just admitted that they are buying from China anyway.
This is really cool
One thing I'd suggest, for any hardware product, is that when doing your bill of materials to provide links and show estimated costs. Sure, these will change but having a rough idea of the costs is really helpful, especially when perusing on from things like HN. It can be a big difference for someone to decide if they want to try it on their own or not. It is the ballpark figures that matter, not the specifics.
You did all that research, write it down. If for no one but yourself! Providing links is highly helpful because names can be funky and helps people (including your future self) know if this is the same thing or not. It's always noisy, but these things reduce noise. Importantly, they take no time while you're doing the project (you literally bought the parts, so you have the link and the price). It saves yourself a lot of hassle, not just for others. Document because no one remembers anything after a few days or weeks. It takes 10 seconds to write it down and 30 minutes to do the thing all over again, so be lazy and document. I think this is one of the biggest lessons I learned when I started as an engineer. You save yourself so much time. You just got to fight that dumb part in your head that is trying to convince you that it doesn't save time. (Same with documenting code[0])
Here. I did a quick "15 minute" look. May not be accurate.
Lidar:
One of:
LD06: $80 https://www.aliexpress.us/item/3256803352905216.html
LD19: $70 https://www.amazon.com/DTOF-D300-Distance-Obstacle-Education/dp/B0B1V8D36H
STL27L: $160 https://www.dfrobot.com/product-2726.html
Camera and Lens: $60 https://www.amazon.com/Arducam-Raspberry-Camera-Distortion-Compatible/dp/B0B1MN721K
Raspberry Pi 4: $50
NEMA17 42-23 stepper: $10 https://www.amazon.com/SIMAX3D-Nema17-Stepper-Motor/dp/B0CQLFNSMJ
That gives us $200-$280 before counting the power supply and buck converter.
[0] When I wrote the code only me and god understood what was going on. But as time marched on, now only god knows.
Learning projects like this are about to get a lot less accessible due to the extreme tariffs and the elimination of the de minimis exemption. Take that BOM and multiply it by 2x or 3x depending on the source and how many different shipments arrived.
I can’t tell you how depressing it is to go from having access to cheap learning materials for introducing kids (and adults) to electronics, and now it’s being taxed away in the name of improving the US competitiveness or something. Total footgun.
I’ve been jokingly calling my electronics hobby of the past few years “getting my EE degree.”
Putting together PCBs, reading and replicating schematics, designing my own hardware. It’s been really fun and has paid dividends for my career (firmware). If the US is interested in bringing knowledge of manufacturing back this is a very very bad way to do it. How many undergrad projects are now impossible because the BOM has quadrupled? How many future mechanical/electrical engineers are not going to get into hardware because of this?
It’s all going to be gone. I’ve spent roughly $1200 last year having PCBs and ordering sensors etc. that goes to $4000 with these executive orders.
It’s insanity, pure insanity.
There is a small part of me that wonders if it is by design, given how much it exposes a person to the inner workings of the things they use daily, but then I bring myself back to reality. I sincerely doubt there is a real long-term plan at work here. At best it is all situational and reactive (tactical, maybe, but not strategic thinking at work).
Both can be true. There was incentive to prevent people from building, repairing, and modifying things. It need not be malicious with the intent to prevent people from getting into the field. Instead it only need be naïve and selfish. Sony suing because people put Linux on their PS3. Remember these people will also try to sue white hat hackers who submit vulnerability reports. It's not a long term plan to destroy, instead it's an extremely short sighted and naïve plan to build a wall and protect.
You'll find that naïve shortsightedness is often indistinguishable from malicious foresight, when looked back upon. So remember how to stop it: think of the little things, the compounding effects. There are always costs and trades being made; there's no free lunch. As the world gets more advanced it gets more complex, and the more complex it gets, the more those little subtle things matter.
There is no plan it’s instincts. Complete devoid of actual thoughts.
Side hobby insanities are not limited to 'EE degree' hobbies! -> Many companies got started as 'side ventures', aka Facebook, MS Windows, Dell, Nerf, and a multitude of other companies.
Use VR / simulations to cut down on Milton Hershey[1] cost issues. Simulations can allow for 'timing the tariffs'.
North American FPGAs may be a thing again! (once pi systems supersedes lambda systems as the multi-FPGA board connection theory of choice :-)
Save on robotic automation costs via software scripts!
Perhaps 3d printed circuit boards will now be a more realistic alternative than traditional outsourced board production? [2][3][4]
Perhaps there'll be some motivation/inspiration for creating/combining standard programming logic/language/3d 'threaded' sock-it programming by combining the (2021) mathematics of knitting[5] with a more modern / inexpensive / upgraded 'general computational device' derived from the Apollo guidance computer[6] concept. aka instead of wasm, weave-em.
Do have to get over the 'stitch in time, saves nine', since most modern computation 'saves in 8'. Easy to unravel program(s). Tight-knit spreadsheets!
But, computational e-ink might scale better using DBOS concepts[7].
-------------------------------------------------------
[1] : https://hersheystory.org/milton-hershey-history/
[2] : https://www.reddit.com/r/3Dprinting/comments/1ajll6s/3d_prin...
[3] : direct desk top printed circuits on paper flexible electronics : https://www.nature.com/articles/srep01786
[4] : 3d printed circuit board : https://all3dp.com/1/3d-printed-circuit-boards-pcb/
[5] : https://www.sciencenews.org/article/how-one-physicist-unrave...
[6] : https://nerdfighteria.info/v/f2ZCVnk-oRU/
[7] : https://en.wikipedia.org/wiki/DBOS
Doesn't change the fact that the advice is still beneficial. At worst you still have a good history of the effect of these tariffs.
I'd call the tariffs the second death of hardware though. The first was when we killed all the parts stores. That was a slower death, coupled with the loss of right to repair. But we've been making big strides in that domain, so I hope we can undo that death. If we can also undo the dumb tariffs too then ironically we might have a chance to bring back hardware which somewhat seems inline with what that party (claims to/pretends to) wants.
> we killed all the parts stores
We ran out of people buying from parts stores - hobby electronics became less popular.
Eras of hobbies:
It is not just a question of popularity. Post-9/11, skillsets around chemicals (or other scary-looking domains) became vilified among the general populace, and then that vilification was normalized by shows like 24 (which is a great show... it just sucks that people are unable to separate the world it presents from their own). It does not help with the advent of red flag laws, where neighbors can effectively call the cops on you if they see something that bothers them.
I'm not aware of the chemical hobby era. And as I watched it happen, the electronic hobby era died at the same time as the mechanical one. Most of the mechanical one was cars. It happened as it became harder to repair things. The barrier to entry increased, and the availability of parts decreased with decreasing demand. It is a self-reinforcing loop.
Mechanical: I was thinking more Meccano and home built steam engines - cars came later.
Chemical: I was thinking chemistry sets and the age when Chemistry was geeky cool (Du Pont, Uncle Tungsten).
My grand father was a structural engineer, my father studied chemical engineering, I studied electronic engineering (but got a job programming). So perhaps I was thinking more of the frontier of engineering shifting rather than technical hobbies.
It would be interesting to look at geeky hobby adverts in a magazine over time and see how the advert focus shifted.
On the other hand, most mechanical things were relatively cheap. Cars were worked on since cars existed. People also worked on everything in their homes. There was the tradesman who takes a specific skill like being able to fix a washing machine, but a lot of these people had hobbies of building and making.
So we have a direct connection to the death of the repairman and death of the handyman. These were quite popular things even up through the 90's. You'll even see this in shows and movies.
Plus, you forgot the most popular hobby of them all: woodworking. Still alive, but nowhere near as popular as a few decades ago.
> Most of the mechanical one was cars. It happened as it became harder to repair things.
I don't think it has become harder to repair cars though. Most problems that need repair are the same old bushings, brakes, spindles, rust, bearings ...
I think it is some cultural change away from handy work in general.
Electronics have become harder to repair with smaller and more integrated circuits though.
Yeah, totally—once fixing stuff got harder and parts disappeared, the whole scene just kind of faded out.
Seems to me it's integration of whole systems.
Telling AI to write computer software.
Mechanical.
Circle of life. ;-)
Doomscrolling and influencing.
A new Radio Shack dealer just opened in a rural area, with an emphasis on radio (it's near an off-roading park), and your comment reminded me what terrible timing this is for them. Such a cruel twist.
https://tekshack.com
Yeah, it's already been like this in a lot of Europe where there is no De Minimis exemption.
E.g. in Sweden, PostNord has a government granted monopoly and charge about $20 per imported package, which adds up fast.
It really sucks, free trade and competition is what we need.
This is not correct. There absolutely is a de minimis exemption for tariffs. You always have to pay VAT, which you didn't always have to do before. You also don't have to pay that fee: you can order from a store that supports IOSS, and the big ones do nowadays. You can also choose to declare the package yourself, but you need to live close to a customs declaration office for this to be feasible. Or you can have it delivered through a different delivery company; they usually declare it for you.
Why didn’t someone just set up domestic manufacturing for those hobbyist parts?
Assuming this wasn’t sarcasm: For many hobbies the parts come from all over the world. You can’t expect hobbyists in every country to set up manufacturing for every part.
Even if they did, the raw materials have to come from other countries. The machines probably come from other countries. Setting up little factories all over the world isn’t efficient so prices would be extremely high. Parts might be cheaper importing from other countries even with extreme tariffs.
It’s all just a mess of bad policy. We lose out when governments restrict our ability to make small, simple purchases from other countries without heavy cost overhead.
Sweden as an example is 10 million people. The world is 8000 million, 800x larger. The affluent parts of the world are somewhere between 100-1000 million, so the market potential is 10-100x larger. If one is able to address the global market, the economy of scale around manufacturing will be heavily in favor of that. This is particularly true for hobbyist stuff, because it tends to be both (relatively) low volume and low margin. And also price sensitive, in that people might just adjust their hobbies based on what is affordable/not.
This is just demand side. Of course producing in Sweden will be more expensive than in China - for electromechanical things, Shenzhen is likely better on every single metric...
Comparative advantage
Is this sarcastic? It does read like it.
I’ll take it on face value:
Cause domestic manufacturing for hobbyist parts is not economically viable
I wonder how that additional cost breaks down. Is it mostly cost of labor? Supply chain access? Environmental controls and compliance? Other overheads not present in China? Is economically viable production possible in the US?
For hobby parts? It’s not viable because you’d be setting up a manufacturing operation to serve a small number of people. The fixed costs would be so high you’d never get it back.
Contrast that with someone setting up an operation to serve the entire world, a market 1000 times larger than many localities.
If you're smart enough to design and build a LIDAR, you're smart enough to make $200,000 a year working for the big adtech companies.
You'd need an enormous hobbyist robotics market to be able to sustain a business making and selling $60 LIDARs with that wage bill.
For Americans, that is.
Come to Australia
I'm happily in the EU
Local warehousing / distribution centers? The US is not short of space.
The good thing it’s on GitHub so you can submit a pull request for a BOM to help the person out.
I know what a buck converter is but I wonder why it's called 'buck converter'. A dictionary doesn't seem to have any words that mean roughly "reducing the amount of something". [0]
Similarly, Wikipedia doesn't elaborate on the etymology. [1]
[0] https://www.merriam-webster.com/dictionary/buck
[1] https://en.wikipedia.org/wiki/Buck_converter
As I recall, it is the surname of the inventor. Perhaps William Buck.
Incredible that this is too expensive for a company like Tesla.
What is the HN opinion on Tesla skipping lidar? Having spent some time with computer vision in university I think it's insane to skip it - sure stereo reconstruction is powerful but lighting conditions have such an impact on results that having some robust depth data feels like a no-brainer and skipping it feels like malignant neglect.
As someone who's done a lot of computer vision, it is insane to skip it. And it's sad because what everyone missed from that viral Mark Rober video [0] was not the Looney Tunes wall hit but the fucking kid in the smoke. Add all the cameras and AI you want, you ain't changing the laws of physics: visible light doesn't penetrate smoke. But radar does. Every (traditional) engineer knows that safe systems have redundancy, and that safe systems have redundancy through differing modalities. Use cameras, but also use radar, lidar, and even millimeter wave. Using just cameras isn't just tying one hand behind your back, it's shooting yourself in the kneecap afterwards.
[0] https://www.youtube.com/watch?v=IQJL3htsDyQ
The argument is that humans manage the task without lidar, and automation doesn't have to be perfect it just has to be better than humans, to be a net positive. It seems to me, you might as well use lidar if it's cheap enough, but the argument that computer systems can outcompete human drivers, without using lidar, is at least reasonable, although not yet proven.
Extending this line of thought I wonder why tesla didn't make cars on two legs and insisted on using wheels?
(Just wanted to make sure - this is not a stab at you, I'm well aware that the original argument is from tesla)
We could extend the argument more. Why build a self driving vehicle at all? Build a humanoid robot to drive the car for you! The argument that computer systems can outcompete human drivers, without using lidar, is at least reasonable, although not yet proven
(I didn't just want to just make sure - this is a stab)
It's a dumb argument on multiple counts. While it's a routine argument in software engineering, it's the kind of argument that gets people fired and sued when it comes out in testimony.
First off, humans don't do it "just by vision". Sure, we don't have lidar but we have hearing, we have touch, we have tons of experience. We can create world models for Christ's sake and that means modeling physics. I'm sure you've seen papers that claim world models but I'm a ML researcher who also has a physics degree and I'm not afraid to tell you that's bullshit. It's as honest as Altman calling GPT PhD level intelligence. A PhD has very little to do with the ability to recall information.
Second off, it doesn't matter much how humans do it. It matters how the car can. Why limit yourself? There are tons of cars with radar and lidar. They're not more expensive, and they can see an object in fog or poor light conditions. They can do something humans can't! Why in the world would you decide not to do that? You can make an argument about price, but that argument changes when the thing becomes cheaper. When that happens, you're just someone adding danger for no reason. You can't argue that cameras alone will be safer. They categorically aren't. The physics is in your way.
But that is the argument made when Tesla first said they were going to use only cameras. Because everyone knew lidar would come down with scale and that's why many other manufacturers went in that direction. Which is mutually beneficial, so Tesla would benefit from joining.
Third, be careful with those claims. I'm more willing to believe 3rd party reports, like from the NHTSA, than ones directly from Tesla [0]
[0] https://www.forbes.com/sites/stevebanker/2025/02/11/tesla-ag...
Humans suck at driving. Almost 2 million people are killed by vehicles on roads every year. While not all of this is due to limitations of vision, it's certainly a contributing factor. Tesla tries to handwave this by saying that their systems don't text and drive, get distracted, etc., but this strikes me as more "AI will become superintelligent real soon now and render humans obsolete" when actual experience with AI is that it is dumb, forgetful, and prone to hallucinations.
Is Tesla getting into a legal mess if they need to add sensors to make self-driving work when they've already sold that feature to car owners? Would that imply they need to retrofit already-sold cars with upgraded sensor packages?
Yes, and this is already turning out to be a problem for them. They've acknowledged that HW3 is not sufficient, and will be on the hook for those who bought the FSD package with those cars.
That isn't the end of the world, but it'd turn into a much bigger problem if they also had to add additional sensors and body modifications to support those sensors.
Solution: just never implement FSD.
For a long time, I honestly thought their solution might be something like this. Either that or they could ship more advanced hardware for 3-5 years before updating the software, so that most vehicles would have the new hardware.
My understanding is that they went the opposite direction - their cars used to have lidar, but don’t anymore.
Worse, they turned them off for the older vehicles with a software update.
They never had lidar. They had a very low resolution radar that was used for AP, and some pretty terrible ultrasonic sensors with massive blind spots.
IIRC Musk specifically said that the cars had sufficient hardware for FSD mode and advertised them as such. Tesla would have to retrofit the LIDAR sensor or pay money back to their customers if they rolled out FSD with LIDAR.
> visible light doesn't penetrate smoke. But radar does.
This is the key insight that frustrates me to no end about the whole thing. We have sensors that are better than human eyes, but we should limit ourselves to that because what? I don't use a calculator because it's slightly better at math, I use a calculator because it's fucking awesome at multiplying numbers in a way that my human brain can't remotely compete with. I want to be able to see where Elon is coming from but lately I can't.
> What is the HN opinion on Tesla skipping lidar?
Short-sighted and egotistical.
There likely have been deaths and injuries that would have been prevented by lidar, and there will likely be more in the future.
It strikes me as foolhardy that the U.S. allows self-driving vehicles to use the public roads without having to pass safety tests. I'd blame the lack of government control of public spaces as well as Tesla's engineering.
Musk is allowed to test in production with 1 ton metal machines racing at 100 km/h without it entailing legal responsibility. Amazing the influence that a few million believers on social media can buy you these days
My opinion is that skipping lidar is nonsense. Check Mark Rober's video, published one month ago, "Can You Fool A Self Driving Car?" where he compares a Tesla with other Lidar-equipped cars: https://www.youtube.com/watch?v=IQJL3htsDyQ
Interesting claim I read in another thread a couple weeks ago:
>Tesla Vision is, currently, legally below minimum human vision requirements and has historically been sold despite being nearly legally blind.
https://news.ycombinator.com/item?id=43605034
> the HN opinion
I'm not sure why you'd think HN has a monolithic opinion, this is a site with myriad different users.
Maybe they're more asking for the whole breadth of opinions available from the HN community?
True, but you will find that some subjects align people more closely than most others.
There will come a time in the very near future (read: five years) when people won't buy a vehicle (car, bike, etc.) without lidar, as its price becomes as insignificant as a reversing camera's and it becomes commonplace.
Personally, I won't buy any vehicle without camera-assisted parking now, and apparently many people agree this is an important feature, including Marques Brownlee [1].
[1] Reviewing my First Car: Toyota Camry Hybrid! [video]:
https://youtu.be/Az6nemkRB1Y
I can only speak for myself, but I work on this stuff in this industry: Tesla’s choice is asinine at this point. It’s one thing to claim cameras only and find that won’t work and pivot, but they are so dug in that they can’t admit they were wrong and won’t do so. So it’s asinine now.
>It’s one thing to claim cameras only and find that won’t work and pivot, but they are so dug in that they can’t admit they were wrong and won’t do so. So it’s asinine now.
Didn't they start with lidar or radar or something similar, and then go back to only using vision-based technologies?
What do you call a person that has only visual senses? Disabled.
Humans have sight, touch, taste, hearing, smell, and vascular senses. Only a portion of those systems are used in self-driving automation.
Humans also have a vestibular system[1] and proprioception[2], which I think are very important in making judgments about safe following distance, car handling, and road conditions, in particular in adverse conditions like strong side winds and slippery or icy roads, which may not always be visually obvious.
While some of this can be handled by an IMU, I think humans still have a strong advantage in fusing their various sensory inputs, thanks to millions of years of evolution.
[1] https://en.wikipedia.org/wiki/Vestibular_system
[2] https://en.wikipedia.org/wiki/Proprioception
Well, I haven't used taste or smell in my driving yet. Touch, only as far as vibration and steering-wheel torque (both not difficult to sense with electronics).
That narrows it down a bit.
Never once smelled or tasted something funny when driving and realized you'd better pull over… like… right now? I have.
A similar situation to Jobs reciting research on how optimal the one button mouse is. A thought bubble.
Why, we will perhaps never know. But likely they were early, and lidar was deemed too expensive back then, or they didn't find a supplier they could work with. Now there's too much prestige in it and they can never back down, which would mean admitting a mistake.
It would be one thing if it was a one time event but then they repeated that playbook with the lack of a rain sensor.
I think it was a valid decision that turned out to be incorrect and is staying put as a result of stubbornness. People really like criticizing decisions in hindsight especially here where the armchair engineer with the benefit of hindsight is too common.
People have been criticizing this decision from the get-go. The criticism may have spread from engineers to the general public, but let's be honest, the latter doesn't matter on a topic like this anyway.
They should have had the benefit of hindsight as well thanks to testing.
It was a valid experiment and pushed computer vision but it clearly failed a long time ago. The fact that Teslas are not only still sold without lidar but “autopilot” is pushed as safe is disgusting.
People will die (and have already died) horrifically because of this decision. It’s morally bankrupt.
I assume the same would apply to any car not using LIDAR? Or just Tesla because they decided on a different tech stack?
Radar technology offers a range of applications, including the ability to detect objects around corners, behind obstacles such as brick walls, and even penetrate human bodies at specific frequencies. However, when multiple sensors yield similar results, it becomes challenging and costly to discriminate which results are valid.
Operating radar at a specific frequency, such as 2.45 GHz (a microwave frequency often used due to its affordability), can be ineffective in environments rich in water droplets (e.g., rain), as these can dominate the radar signals. Higher frequencies enable the detection of smaller water droplets, but switching between frequencies can be expensive. Additionally, varying the radar's detection range to identify objects of different sizes complicates the calculations, involving factors such as minimum and maximum range, power, and time on target.
Cameras typically detect non-moving objects by comparing successive images. In contrast, radar can identify both stationary and moving objects and determine their direction relative to the sensor by emitting a frequency and analyzing the reflected pulses. Lidar, on the other hand, uses light to measure the distance to objects in its path, employing a photoreceptor to capture the reflected light.
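To make the range/power/time-on-target tradeoff mentioned above concrete, here's a sketch of the simplified monostatic radar range equation. All numbers are illustrative assumptions for the 77 GHz automotive band, not specs from any real unit:

```python
import math

# Simplified monostatic radar range equation:
#   Pr = Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4)
# Received power falls off as 1/R^4, which is why detection range is so
# sensitive to transmit power and antenna gain.
Pt = 10.0            # transmit power, W (illustrative)
G = 10 ** (25 / 10)  # antenna gain, 25 dBi (illustrative)
freq = 77e9          # automotive radar band, Hz
lam = 3e8 / freq     # wavelength, ~3.9 mm
sigma = 1.0          # target radar cross-section, m^2 (roughly a pedestrian)
R = 100.0            # range, m

Pr = Pt * G**2 * lam**2 * sigma / ((4 * math.pi) ** 3 * R**4)
print(f"received power at {R:.0f} m: {Pr:.3e} W")
```

The result is on the order of 10⁻¹¹ W, which is why receiver sensitivity and integration time (time on target) matter so much in the tradeoffs described above.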
That'll likely be why LIDAR is used rather than RADAR.
I was responding to Tesla skipping lidar, but apparently it's a sore topic. As pointed out by someone else, Tesla used to use ultrasonic and radar sensors but stopped.
“ Until this month's change, all Model S and Model X EVs intended for North America were equipped with radar sensors but the company has been building new Model 3 and Model Y vehicles without any front radar sensors since May 2021. That's when Tesla announced a change for those models away from radar to Tesla Vision” 0
I don’t think many people understand how these systems work but we’re not on a radar or engineering forum.
0. https://www.caranddriver.com/news/a39250157/tesla-no-radar-s...
This isn't too expensive for Tesla, it's just nowhere near the level needed for an AV. Automotive lidars are 10-20 scans/second, rated for dust/rain/etc, and need a range of at least 50 meters, but 100-200 is more ideal. Not a fan of Tesla's approach, but I wanted to clarify that it's not like they can just use a lidar like this and call it a day. The specs are completely different and that really drives up cost!
Maybe something like this: https://linuxgizmos.com/sony-introduces-as-dt1-described-as-...
> With a horizontal field of view of 30° or more and 576 ranging points (24 x 24), the sensor supports a frame rate of 30 fps, with a reduced 15 fps mode for maximum distance operation.
The requirements for electronics in a car are pretty extreme (temperature, durability). Not that I disagree, but it's not an apples-to-apples comparison.
> requirements for electronics in a car are pretty extreme
+ the salaries of everyone working on that stuff, not just assembly but also writing the code to support it
Not that I disagree, either: at the volumes a modest car company puts out, I'd assume it's easily worth the, say, 3% premium on the car's total price to have something that can actually see things the cameras don't, and thus makes for a safer system. It might even reduce costs by lowering the requirements for the vision hardware and software, but that's not something I can know. There are a lot of unknowns here, which I think means we can't really do a good comparison.
That hasn’t stopped Tesla before. They have a track record of treating automotive-grade quality standards as optional when doing electronics sourcing[1].
As the article notes, Tesla conveniently "fixed" the thermals and durability issue this caused by inventing a feature called cabin overheat protection and marketing it as being for people and animals, not for the non-automotive-spec electronics in the cabin.
If you can’t bring auto quality electronics to the car, just change the car so it avoids standard auto thermal conditions ¯\_(ツ)_/¯
https://www.thedrive.com/tech/27989/teslas-screen-saga-shows...
Don’t they constantly get tested as the safest car in the world? I saw it years ago in some American news and the first google result is from New Zealand last year https://www.drivencarguide.co.nz/news/tesla-model-y-is-the-s...
They wanted to sell “self driving ready” packages 10 years ago, when LiDAR actually was expensive. So at the time, they had to make big deal about LiDAR being unnecessary.
But now it has reportedly come down in price by more than a factor of ten, so at some point a logical person would revisit that decision.
It's not only about logic; his ego is now involved in it, which virtually guarantees it will never be revisited.
Would this not also be the case had Tesla embraced the tech and installed thousands into their cars?
The lidars used on self-driving vehicles are far more capable and far more expensive.
Not by that much: current-generation hardware for cars is $500-700, and some OEMs expect to bring the price below $200 with next-generation equipment. Now that BYD puts self-driving in almost every car, it will supercharge adoption, and lidar prices might drop even faster with economies of scale.
My (tenuous) understanding is that the challenge with lidar isn't necessarily the cost of the sensor(s) but the bandwidth and compute required to meaningfully process the point cloud the sensors produce, at a rate/latency acceptable for driving. So the sensors themselves can be a few hundred bucks but what other parts of the system also need to be more expensive?
That seems very unlikely to me. Automotive applications are already doing things like depth reconstruction based on multiple camera angles and ML inference in real time. Why should processing a depth point cloud be significantly more difficult than those things?
The basis for my understanding is a convo with a Google engineer who was working on self-driving stuff around 10-15 years ago -- not sure exactly when, and things have probably changed since then.
At the time they used just a single roof-mounted lidar unit. I remember him saying the one they were using produced point cloud data on the order of Tbps, and they needed custom hardware to process it. So I guess the point cloud data isn't necessarily harder to process than video, but if the sensor's angular resolution and sample rate are high enough, it's just the volume of data that makes it challenging.
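A back-of-envelope estimate suggests the finished point cloud from even a high-end spinning lidar is heavy but nowhere near Tbps, so the Tbps figure presumably referred to raw sensor/waveform data or an earlier custom design. Parameters here are illustrative guesses, not any product's specs:

```python
# Back-of-envelope point-cloud data rate for a spinning lidar.
# All parameters are illustrative assumptions.
beams = 128            # vertical channels on a high-end automotive unit
horiz_steps = 2048     # angular samples per revolution
rev_per_sec = 20       # scan rate (10-20 Hz is typical for automotive lidar)
bytes_per_point = 16   # xyz floats + intensity + timestamp, roughly

points_per_sec = beams * horiz_steps * rev_per_sec
mb_per_sec = points_per_sec * bytes_per_point / 1e6
print(f"{points_per_sec:,} points/s, about {mb_per_sec:.0f} MB/s")
```

That works out to roughly 5 million points and ~84 MB per second: a lot, but well within what a modern embedded GPU can ingest.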
Maybe at that time. 10-15 years later we have graphics cards doing actual ray tracing, so lidar computing is way less complex. Anyway, the $200 figure is for the whole system, not just the sensors, so that would include signal processing.
Makes sense. Maybe doing self driving well just requires a ridiculously high bandwidth regardless of data source. Related, the human visual system consumes a surprisingly large quantity of resources from metabolic to brain real estate.
The whole point of lidar is to massively increase the amount of ranged data you have to work with.
This doesn’t seem to stop Teslas competition in self-driving cars from implementing it; and succeeding far more in safety and functionality while doing so.
What is the cost of a human life worth?
edit: seriously, is a $4,000 sensor and an extra, say, $3,000 for an upgraded computer module so your car can drive itself just too much to afford?
Valuation of a statistical life is $5-10M, depending on who you ask[0].
So it's too much to afford, or at least not singularly justifiable, unless more than roughly 1 out of every 700-1,400 cars (depending on where in that VSL range you land) would otherwise kill someone in a way that LIDAR would prevent.
0: https://www.sciencedirect.com/science/article/pii/S109830152...
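The break-even arithmetic is easy to check. Using the thread's hypothetical $7,000 package cost and the $5-10M VSL range:

```python
# Back-of-envelope: when does a ~$7,000 lidar + compute package "pay for
# itself" in statistical-life terms? All figures are the thread's
# assumptions, not real Tesla costs.
sensor_cost = 4_000    # hypothetical lidar sensor cost, per the comment
compute_cost = 3_000   # hypothetical compute upgrade, per the comment
package = sensor_cost + compute_cost

for vsl in (5e6, 10e6):                # valuation of a statistical life, $
    breakeven_cars = vsl / package     # cars sold per one prevented death
    print(f"VSL ${vsl/1e6:.0f}M -> break-even at 1 death per "
          f"{breakeven_cars:,.0f} cars")
```

That gives one prevented death per ~714 cars at a $5M VSL and per ~1,429 cars at $10M, and it ignores prevented injuries and property damage, which would push the break-even point even lower.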
At this point having "something" would probably even beat having nothing.
I guess it's simply a big numbers thing. If you sell lots of cars, shaving a couple of hundred dollars of each car adds up.
Karpathy addressed this question at the time:
https://news.ycombinator.com/item?id=33397093
Of course he was working for Tesla back then. His opinions might be different today given that Elon is no longer signing his paycheck.
What a weird argument by Karpathy. He has a degree in physics; how does he not know that radar can see things that are not possible through camera vision? The argument doesn't make sense. That there are supply chains and things break, and that makes it unsafe? Well, that's true for every single bolt, every nut, every little thing. I understand a drive toward simplicity, but you can't just throw fancy words like entropy in there while ignoring the literal physics that says camera + radar gives you more (unique!) information than cameras alone.
His opinions aren’t much different in interviews I’ve heard since, although of course that doesn’t mean he’s completely unbiased now.
Money better spent on marketing. Like that song about "him having a plan".
After all car sales don't drive the stock market. Public opinion does.
I’ll bet a lot of Tesla investors are wishing neither of those applied these days.
Do we really need LiDAR in a Tesla? I own a Chevy Trax and it has LKAS and ADAS, not even using LiDAR, just sensor fusion with camera and radar. It's a cheap car too. It's camera-assisted driving.
I have driven a Tesla once but not with the added feature.
This isn't really something you'd ship in a car though. It's cool that we have such a rich ecosystem of devices that this can be made "off-the-shelf" - but for production use in a car? Not really practical.
This is the most ungrateful comment I've read today, harping away about how 'it should have been done'.
Well you fucking do it then.
I know that my time is so short (because I have a family) that if I can even do a project then I'm almost certainly not going to document it because getting it done will be enough of a stretch for me, and if I need to come back and re-do it again, I am probably not going to even bother. Not all of us live in mom's basement and have the luxury of extra time.
It was not ungrateful.
It was a general suggestion for everyone doing hardware projects and OP did a lookup and provided the additional info / links, which sparked further discussions.
Chill.
He did 'do it', and saved us all the 10-15 minutes it took.
The actual scanners: [1]
Max range 12 meters. That's when it seems to start to get expensive. The light source, filters, and sensors all have to get better.
Good enough for most small robots. Maybe good enough for the minor sensors on self-driving cars, the ones that cover the vehicle perimeter so kids and dogs are reliably sensed. The big long-range LIDAR up top is still hard.
[1] https://www.ldrobot.com/
I'd like to know where this price jump really comes from. Google doesn't help me. My first guess is that laser safety becomes an active control process at this point: the scanner mirror needs to keep moving so the beam can't deposit a damaging amount of energy onto a human retina. So you need a safety-critical control system to constantly monitor mirror speed and position and shut down the laser when it becomes too slow. How wrong am I?
More output power, larger optics, more sensitive detectors, more rejection of unwanted light, more pixels, larger rotating machinery, active stabilization... And the big units are low volume.
Here's a top of car LIDAR you can buy for about US$27,000.[1] 128 pixels high sensor, spinning. This is roughly comparable to Waymo's sensor.
[1] https://www.hesaitech.com/product/ot128/
This is amazing! Thank you!
It may be in the project docs, as I just scanned through it (but will read through it properly soon), but do you have any data on accuracy? Say, over 10 m (or less, if this lidar doesn't work at that distance).
I'm familiar with the FARO scanners which have a different type of mechanism. Their accuracy is good enough for building things.
I've discovered there are several markets for scanners… among them people who need accuracy and people who are creating content for media like games.
Thank you so much for sharing this project. It's truly unbelievable.
Somewhat related. I'm looking for a cheap way to measure distances to approx 10 microns accuracy, over distances on the order of 300mm. Any ideas?
That sounds like it could be in the range of a digital readout (DRO) as used on milling machines and lathes. The My Mechanics channel had a video just the other day about replacing one, and the old one had 5-micron accuracy.
Not sure how much they cost, though.
Does the interval you're measuring move around much?
Can the measurement system touch or be affixed to it?
Sounds like a pair of nice calipers might work. So depending on your precision needs, you might get away with the same approach: a sliding grid of capacitive cells that passes over a fixed grid of measurement cells. A microcontroller measures them as the scale slides, then atan2() for the final result. The display-only part of this is called a DRO (digital readout).
I have some design ideas for a diy system, how much money/time are you willing to spend for experimentation?
What counts as cheap to you?
I'm thinking about automating something along these lines:
https://youtu.be/hnHjrz_inQU?si=dNzXVBVFsr7e8m_6
Off the shelf lasers and camera sensors can be hacked around with DIY for some pretty unexpected precision.
Thanks for sharing this video; I'm also interested in this exact thing. However, from my understanding, with an approach like this you are limited by the size of the image sensor, meaning a bump larger than the image sensor would not get measured. Any idea how to make something like this work if the goal were to measure slightly larger topographical changes, at a less granular resolution, in the 100 mm range?
Yea I have some ideas, but I haven't found an easy way to implement it yet.
The term of art that I'm exploring is called "Holographic Interferometry".
Sibling poster gave you a link to regular interferometry.
But basically, if you split the laser beam, one goes straight into the camera sensor, and the other off your object, you can do some pretty amazing things. Depending on a lot of little details (The devil is hiding here).
I found 3Blue1Brown's explanation to be the best, but less "ready to use".
https://youtu.be/EmKQsSDlaa4?si=j-YJm6scxK6bh_Is
Very, very cool. I think what I'm looking to do is much simpler: trying to create precise mappings of golf greens. Starting out just using a simple ToF sensor and running it over the surface.
Making a fringe counter out of a Michelson interferometer is a classic project:
https://www.youtube.com/watch?v=j-u3IEgcTiQ
https://www.youtube.com/watch?v=ucuVsReDze0
https://en.wikipedia.org/wiki/Michelson_interferometer
Note those 532nm green DPSS lasers are repeatable within +-1nm across their normal operating temperature. Adding a 20nm wide OD6 narrow band-pass filter to a $5 5mW DPSS laser module is the cheapest precision money can buy these days.
Really depends what one can get away with in the mechanism being built. Note, many machines will fall under export restriction, and as a company people have to decide whether that encumbrance is worth the hassle.
Best of luck =3
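For anyone curious about the math behind fringe counting: each full fringe corresponds to λ/2 of mirror travel, so the 10-micron target from upthread is only a few dozen fringes with the 532 nm laser mentioned above. A minimal sketch, with numbers taken from the comments:

```python
# Michelson interferometer as a displacement gauge: the measurement mirror
# moving by half a wavelength shifts the pattern by one full fringe, so
#   displacement = fringe_count * wavelength / 2
wavelength = 532e-9          # m, green DPSS laser per the comment above

def displacement_from_fringes(fringes: float) -> float:
    """Mirror travel in metres for a given fringe count."""
    return fringes * wavelength / 2

# How many fringes correspond to the 10-micron accuracy target upthread?
target = 10e-6               # m
fringes_needed = target / (wavelength / 2)
print(f"10 um ~= {fringes_needed:.1f} fringes")
```

About 38 fringes per 10 microns, so even a naive fringe counter has resolution to spare; the hard part is mechanical and thermal stability over the 300 mm travel, not the optics.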
Really cool, man; I'll need to spend some time to fully wrap my head around this. I wonder if it's going to provide too much granularity, as I'm measuring slope variations of, say, 2 cm over 100 cm, like a golf green. Trying out just a basic ToF sensor as of now.
Maybe check out this for ideas? https://youtu.be/qMYBwbTIL-0
OCT.
There are cheap OCT systems?
For what purpose?
https://xyproblem.info/
Answering their question would be more helpful here, even if it doesn't solve their problem.
Not OP but I'm in the same market, 3d printing and desktop CNC for me.
Assuming the XY problem based on nothing is pointless and counterproductive and only serves to make you feel smart.
We had a similar issue at one point, and had to build something custom that cost way more than I'd like to admit. Thus, I would recommend just looking at DRO kits for CNC milling machines.
If your project is not budget constrained, than there are complete closed-loop stage solutions around:
https://www.pi-usa.us/en/
https://xeryon.com
Best of luck, and prepare yourself for sticker shock... lol =3
There's a lot of stuff that was better in the "good old days".
But to be alive when it's possible for gifted individuals to create technology like this is just incredible.
The sketchfab examples are fantastic, to be able to move around in a 3D space, like it's some kind of scifi simulation.
The mouse controls are confusing the heck out of me. It shows a 'grab' icon, but nothing about it grabs; the movement direction is the opposite of what you'd expect, which feels completely unnatural.
You could probably harvest these from robot vacuums on ebay/goodwill.
These = lidar sensors
I've been toying with photogrammetry a little bit lately, specifically for scanning indoor rooms and spaces. So far I'm finding metashape the most suitable for it, but some of the precision isn't great (but I'm still improving my technique). I mostly want to convert the interior of one real building into a digital model for preservation and analysis. I've briefly considered LIDAR, but put it in the too hard/expensive bucket. This project seems to challenge that assumption.
What does the software post-processing look like for this? Can I get a point cloud that I can then merge with other data (like DSLR photographs for texturing)?
I see in their second image[1] some of the wall is not scanned as it was blocked by a hanging lamp, and possibly the LIDAR could not see over the top of the couch either. Can I merge two (or more) point clouds to see around objects and corners? Will software be able to self-align common walls/points to identify its in the same physical room, or will that require some jiggery-pokery? Is there a LIDAR equivalent of coded targets or ARTags[0]? Would this scale to multiple rooms?
Is this even worth considering, or will it be more hassle than it's worth compared to well-done photogrammetry?
(Apologies for the peak-of-mount-stupid questions, I don't know what I don't know)
0: https://en.wikipedia.org/wiki/ARTag 1: https://github.com/PiLiDAR/PiLiDAR/raw/main/images/interior....
Shameless plug, but if you own an iPhone pro or iPad Pro (which have Lidar integrated), you should give Dot3D a try. It does everything you describe and we made it very easy to use.
Thank you for the reply. Unfortunately, I don't own an iPhone - maybe I can borrow one, though. Any limitations of the app or practical advice you might want to share?
I noticed that you can't use it commercially without contributing. How much do you have to contribute for that and where would someone do it?
Hi! Thanks for sharing this amazing work. I’m curious about the scalability and performance of PiLiDAR when deployed on large-scale outdoor datasets. Have you benchmarked it on datasets like SemanticKITTI or nuScenes? If so, could you share any insights on runtime, memory usage, and how well it generalizes beyond the indoor scenes used in your paper?
I think you (or I, please correct me if that's the case) misunderstood something here: this is a DIY lidar scanner for data acquisition. Those datasets are mostly created using RGB cameras, and the point clouds are generated in a post-processing step.
So it's not a model for processing data but rather a hardware hack for getting a real lidar, as in real depth data.
You can throw anything you like at it.
Oh hey! This is exactly what I was looking for just a couple weeks ago! I've had parts to prototype something roughly equivalent to this sitting in my cart on Amazon for a couple weeks now, but I've been very uncertain on my choice of actual lidar scanner.
I'll have to look into this as a starting point when I get back from Easter vacation.
For home improvement projects, this could be quite useful for generating a point-cloud map of places that are hard to get to. Like, I have drywall installations I would love to get behind and check how things look; this would be great for that.
This is a very legit and good idea. A simple stud-finder like tool to map out behind walls would be incredibly useful for folks who run cabling or whatnot.
GY-521 in particular and MPU6050 in general make quite poor IMUs. Why do you use them? And what for in this particular case? What do they do in this set up?
Do you have other sensors in the same price range that you'd recommend instead for most uses? How much accuracy improvement would you expect?
Yaw drift is my problem, so I tried a bunch of IMUs. Ones built around BNO055 seem to be alright and they are not that much more expensive. I ended up using Adafruit's.
How safe are these sorts of sensors for eyes?
I'm wondering the same. There are many reports of lidars damaging camera sensors.
How do you make it so your LIDAR doesn't interfere with someone else's LIDAR?
I think it's unlikely, because both lidars would have to be pointing at exactly the same place at exactly the same time and using the same frequency. Not impossible, but probably not a big deal.
Multi path errors are rare, and if it happens there are mitigation techniques.
Wow. Lidars have become so good. This is amazing. I had no idea
It's not obvious what the heck this is without reading into it. A full 4π-steradian scanner? A 360-degree single-channel LIDAR? A fisheye camera plus a single-channel LIDAR plus monocular depth-estimation networks to cover everything not in the plane of the lidar?
It would be great to clarify what it is in the first sentence.
I believe it's a 360deg planar lidar mounted on a vertical plane, with a motor to rotate it around and slowly cover a full 4pi sphere. There's also a fisheye camera integrated in. This is a pretty common setup for scanning stationary spaces (usually tripod mounted)
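For intuition, the geometry of that setup (a planar lidar swept about a vertical axis) reduces to a spherical-to-Cartesian conversion. A hypothetical sketch; the axis conventions and angle names here are my assumptions, not necessarily what PiLiDAR actually uses:

```python
import math

# Each lidar sample is (pan angle of the rotating head, angle within the
# lidar's own scan plane, measured range) and maps to a Cartesian point.
def to_xyz(pan_deg: float, scan_deg: float, r: float):
    pan = math.radians(pan_deg)    # rotation of the whole lidar head
    scan = math.radians(scan_deg)  # angle within the vertical scan plane
    # point in the scan plane, then rotate that plane about the z axis
    x_plane = r * math.cos(scan)
    z = r * math.sin(scan)
    return (x_plane * math.cos(pan), x_plane * math.sin(pan), z)

print(to_xyz(0, 0, 1.0))    # straight ahead: (1.0, 0.0, 0.0)
print(to_xyz(90, 0, 1.0))   # head rotated 90 deg: approximately (0, 1, 0)
print(to_xyz(0, 90, 1.0))   # straight up: approximately (0, 0, 1)
```

Sweeping `pan_deg` slowly through 180° while the lidar spins its own plane at full rate is what lets a cheap 2D unit cover the full sphere.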
It's a fisheye camera plus a single-channel LiDAR.
It's impressive that the cost of usable LIDAR tech is well within the reach of personal projects now. The sensors used on the first self-driving cars (from companies like SICK, etc.) likely perform much better but the price point of multiple k$ is not really viable for experimentation at home.
Not to make everything political, but I wonder how the US tariffs will affect electronics-adjacent hobbies. Anecdotally, the flashlight community on Reddit has been panicking a little about this.
I'm sure most electronic hobby projects are going to be financially out of reach for many people, for a while at least. Many people who run businesses around small homebrew projects are struggling too [1]. But it can be extremely hard to tell what might happen with a POTUS who seems to change his mind on tariffs on a whim, with zero apparent thought process, no prior notice of when they're going to be implemented or removed and then implemented again at 500% or whatever.
I know Hong Kong Post also recently blocked outbound packages to the US entirely [2], so I don't know how that's impacting shipments of tech like this, but I'd be curious to know.
[1] Arduboy creator says his tiny Game Boy won’t survive Trump’s tariffs https://www.theverge.com/news/645555/arduboy-victim-trump-ta...
[2] Hong Kong suspends package postal service to the US after Trump’s tariff hikes https://www.cnn.com/2025/04/15/business/hong-kong-suspends-p...
Never mind hobbyists - I work in electronics R&D and my two favorite suppliers are US based even though I am not. Anxious to see how this plays out and that's not even considering our production departments.
> Not to make everything political... [proceeds to make a political statement]
For what it's worth, this type of Lidar scanner was possible to make well over a decade ago with ROS1, a Phidgets IMU, a webcam, and a lidar pulled out of a Neato vacuum (the cheapest option at the time). This would be around the difficulty of a course project for an undergraduate robotics class and could be done with less than 200 USD of salvaged parts (not including the computer). Hugin was also around over a decade ago.
It's still a nice little project!
I would not consider asking a question about the impact of current events on a market segment relevant to the discussion topic to be political. The disclaimer is presumably to encourage respondents not to drag things in an off topic direction. Ironic, considering the outcome.
This seems to be using classic formula -> get trivial, ready made component, design 3D printed enclosure and hook it up to Raspberry Pi. Instant Hacker News homepage.
So, where's your hardware project? Have you ever made one? I think you're underestimating the amount of time and effort that went into the linked project.
> Not to make everything political... [proceeds to make a political statement]
Being all polite and non-political and shit is what brought us to this pass.
Never lose an opportunity to make the people who voted for the current state of affairs feel isolated, rejected, guilty, and generally bad. Being nice to them doesn't work.
Please, I don't want to come on to HN to see politics injected into everything. Stay on reddit for that.
I logged in to make a comment regarding something within my area of expertise: the technology present in the parent link and how this technology has been accessible to hobbyists for over 10 years.
>I don't want to come on to HN to see politics injected into everything
If it's political to wonder how tariffs impact the cost of the project we're discussing, then everything is political, and it's pointless to complain about politics being "injected into everything."
lobsters might your place if you would like to insulate yourself to that degree
You’re not making me feel isolated, rejected, guilty, or generally bad.
You’re feeding into the confirmation bias I already have about how the opposition thinks, which only serves to affirm the choice I made.
>You’re feeding into the confirmation bias I already have about how the opposition thinks
It's wild that you acknowledge your cognitive bias and then blame others for it instead of working on it. If I wrote something like that, I hope I would have the wherewithal to notice that something is seriously wrong with my thinking.
We all exhibit cognitive bias.
I’m illustrating how the original behavior feeds confirmation bias instead of establishing a basis for constructive dialog.
Yes the opposition thinks evil is evil. The opposition also thinks water is wet. Check back here tomorrow for more obvious things rational people think.
The opposition reductively believes this is an existential battle between “good and evil”, they’re the “good”, and that’s a position from which one can justify almost anything to eradicate “evil”.
Well, Trump is the one who almost always frames things in a very binary way. If someone contradicts him, it is "fake news". His opposition is typically much less so, and much more rational and thoughtful.
Even many in the opposition agree with many of his goals (control immigration, protect American industries, shrink the government).
How many Supreme Court rulings does it take for a Trump supporter to admit the Trump administration is unjust? The world may never know.
You can always know, if you want to, by actually engaging in constructive dialog. Which probably isn’t going to happen in this thread because it’s ostensibly about a raspberry pi LiDAR scanner, and thus neither really the time nor place.
The MAGA crowd is not even remotely interested in 'constructive dialog' and is so far down the hole of drinking the kool-aide, constructive dialog with them will likely never be possible.
You cannot have constructive dialog about astronomy with someone who thinks the sky is made of green and purple polkadots because that's what someone told them, and dismiss all evidence to the contrary as a massive conspiracy.
They don't even believe in democracy or constitutional rights - at least, for anyone but them.
I’m interested in constructive dialog, and I believe in democracy and constitutional rights. However, this is a thread about a neat LiDAR scanner.
It's funny - first you call me reductive but now it's all "I'm staying out of this one". Interesting how that goes.
Which many people could have afforded to build a few weeks ago, but now can't.
It's true, a Hokuyo or a SICK that sold for several thousand dollars a decade ago is laughably bad compared to something under $100 from Shenzhen these days. When there's a need there's a way, I guess.
I hope they decide to develop some disruptive stereo/structured light/tof cameras eventually too, those are still mostly overpriced and kinda crap overall.
Short term there's some suffering but while hobbyists are definitely more price sensitive, they are also the most flexible ones. In production you don't just need one piece, you need a steady supply and any change of components affects the whole product.
How China and the US interact will determine the longer-term future of that economic relationship, but many companies are already adjusting because the future is currently uncertain. With the free trade agreement with the EU and more producers moving to the US, I think that it's been a good disruption, even if I'm now also scrambling to find alternative PCB manufacturers.
>With the free trade agreement with the EU
There is no such agreement.
>more producers moving to the US
How many will follow through with these announcements? During Trump's first term, announcing huge projects in the US and then not following through was a common tactic for companies dealing with Trump. Foxconn, for example, announced a new $10 billion factory in Wisconsin. They made some initial investments and stopped when people stopped paying attention. Instead of the promised 13,000 jobs, they now employ about 1,000 people there.
And what about all the companies that will have gone out of business by then? This mainly affects small companies, which are exactly the companies you need for a healthy economy. In some cases, they have shipments already paid for that they can't accept because they don't have the liquid assets to pay the unexpected tariffs, so these companies are now at risk of going out of business completely unnecessarily.
It never makes sense to use tariffs for economic reasons. It just does not work. Tariffs can make sense for strategic reasons if you're willing to take an economic hit to lower dependence on other countries for critical industries or technologies. However, the idea that taxes are ever "a good disruption" for the economy does not bear out.
>It never makes sense to use tariffs for economic reasons. It just does not work.
This week, two US companies from which I bought some products from Europe sent me an email explaining how they have to raise their prices due to tariffs, as they need to import from China for now.
Guess who will be faster: these companies finding an alternative supplier in the US that matches China's quality-price, or me finding an alternative supplier from China? They just admitted that they are buying from China anyway.