Thanks for your fast answers. Let me clarify a few of them:
1. I have actually tried to compile it for Windows 10 using a repository from another user. It looks like he has copied your code and instructions completely and just named it https://github.com/kentaroy47/Livox-SDK2.3-for-windows
I used VS 2022 (the latest version) yesterday following his instructions and it did not work. It may not have been only the VS version but also a Windows Firewall problem, so just in case I removed VS 2022, installed the 2019 version, turned off the firewall and went through your instructions again. This time it worked, although it would not let me build a 64-bit version and I could only compile Win32. I would suggest updating your readme for VS 2022, as it is the current version of Visual Studio and some people will want to start with it.
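For reference, these are the CMake generator lines involved (the VS 2019 / Win32 combination is the one that actually built for me; the VS 2022 / x64 one is what I could not get to work, so treat that line as untested):

```
:: what finally built for me: VS 2019 toolchain, 32-bit
cmake .. -G "Visual Studio 16 2019" -A Win32
cmake --build . --config Release

:: what I tried first and could not get to build: VS 2022 toolchain, 64-bit
:: cmake .. -G "Visual Studio 17 2022" -A x64
```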
...
3. I would be more than happy if I knew what algorithm to use for rotating the data by a known yaw value (a quick sketch of the maths I have in mind is below, after the test description). It seems simple and fast if you just apply, say, a -18 degree yaw through Extrinsic Calibration to turn the data 18 degrees clockwise. It is just that doing it manually after each turn stretches the whole 360 degree rotation from about 1 minute (20 stops of 2 seconds each, plus a little time to move from point to point) to 10 minutes or even more.
I am not just guessing here. I have already tried to do all of this manually and it worked, using nothing but Livox Viewer.
So this is what I do, just to test it with a single 18 degree clockwise rotation:
I capture data at 0 degrees yaw for 3 seconds, stop, and save the capture as, for example, yaw0.lvs. Then I start another capture just so I can perform Extrinsic Calibration, stop capturing, select the captured data, enter -18 degrees for yaw and apply it.
Then I rotate the lidar on a motorised head by 18 degrees, capture for another 3 seconds, stop capturing and save it as yaw18.lvs.
Now, after converting both files to LAS or CSV, I open them in CloudCompare and they are aligned pretty well. There may be some offset, but that can happen because
a) my rotator may not be rotating exactly 18 degrees; it may be 18.5 or even 18.05;
b) parallax error may be coming into play, as I have no idea where the laser source sits behind the lidar window. I just try to put the rotation axis close to the reflective window on the Mid-40, but the light source could be much deeper inside.
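To be clear about what I mean by rotating data with a known yaw: it is just a rotation about the vertical (Z) axis applied to every point of a capture. Here is a minimal sketch in C++ of the maths I have in mind; nothing in it is Livox-specific, the Point3/ApplyYaw names are mine, and I am assuming yaw is measured about Z with the same sign convention the Viewer uses for Extrinsic Calibration:

```cpp
#include <cmath>
#include <vector>

struct Point3 {
    float x, y, z;
};

// Rotate a whole capture about the Z axis by yaw_deg degrees.
// Applying yaw_deg = -18 here should correspond to entering -18 for yaw
// in the Viewer's Extrinsic Calibration (sign convention assumed).
void ApplyYaw(std::vector<Point3>& cloud, float yaw_deg) {
    const float yaw_rad = yaw_deg * 3.14159265358979f / 180.0f;
    const float c = std::cos(yaw_rad);
    const float s = std::sin(yaw_rad);
    for (Point3& p : cloud) {
        const float x = p.x * c - p.y * s;  // standard rotation in the XY plane
        const float y = p.x * s + p.y * c;
        p.x = x;
        p.y = y;                            // z is untouched by a pure yaw
    }
}
```

With that in place, merging the 20 stops is just applying -18 * k degrees to the k-th capture and appending all the points into one cloud.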
Since I have compiled the SDK for Windows now, I will see if my old programming skills are enough to make what I want.
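If I am reading the SDK headers correctly, the first thing I will try is setting the extrinsic yaw from code before each capture, so the manual Extrinsic Calibration step disappears. Something along these lines; this is only a sketch from my reading of livox_sdk.h, so the exact function and struct names and units need to be checked against the headers, and the handle would come from the usual device-connected callback:

```cpp
#include "livox_sdk.h"

// Rough idea only: set the lidar's extrinsic yaw for the k-th stop of a
// 20-stop, 18-degree-per-stop sweep. Function and struct names are from my
// reading of livox_sdk.h / livox_def.h and should be double-checked there.
void SetYawForStop(uint8_t handle, int stop_index) {
    LidarSetExtrinsicParameterRequest req = {};
    req.roll  = 0.0f;
    req.pitch = 0.0f;
    req.yaw   = -18.0f * static_cast<float>(stop_index);  // degrees, if I read the header right
    req.x = 0;
    req.y = 0;
    req.z = 0;  // translation offsets, in mm I believe
    // Null callback for brevity; a real program should check the returned status.
    LidarSetExtrinsicParameter(handle, &req, nullptr, nullptr);
}
```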
But as a side note, are you open to suggestions about what could be added to Livox Viewer so that it
a) becomes a complete, single, effective tool for capturing 360 degree data with one lidar, driven by a user-entered preset describing how the lidar will be rotated;
b) does the same, but with IMU data coming from a sensor mounted and aligned on top of any of your lidars, feeding all of its readings to the Viewer through a COM port?
If you implement this you will sell many more lidars, because it will not require any programming at all; everything needed will be done in your Viewer. Considering the cost of your products, there will be a lot more end users willing to buy them, since all it costs them is your hardware, some LiPo batteries and an IMU in the $30-500 range.
I am posting my progress with the Mid-40 on two forums with a lot of 3D/360 photographers, and their users seem very interested. I am sure most of them would go and buy one right now if I said: you only need a $600-1300 Livox lidar plus an extra $300 for additional hardware, and you can use the free Livox Viewer to capture 360 degree data with a single unit in a point cloud format open to anyone.