Capturing all around point cloud with Mid-40 and Livox Viewer

Author: WingmanMedia | Time: 2022-2-19 16:46:34

Last edited by WingmanMedia on 2022-2-19 17:30

Is there any extensive guide on how to use Livox Viewer for capturing a point cloud from a Mid-40?

I was under the impression that I only needed a Mid-40: I could rotate it on a motorised head while recording all data in Livox Viewer, and I would get a 360-degree point cloud of the surroundings once I completed the whole rotation. I did a full 360-degree rotation while recording, stopping every 30 degrees for 3 seconds (to get the most coverage). When I completed the circle, I stopped recording and saved the file. Here is a playback of that capture: https://www.youtube.com/watch?v=f8_2kwwqwYk

If I convert the saved file and bring it into CloudCompare, I just get point cloud data shaped like a cone. If I play the saved file back, pause, and save, it seems I am only saving what is currently in the paused frame. So it looks like to get all 12 point clouds (360/30), I'd need to step to the frame where each one is shown and save them one by one, correct?

Now, after reading a bit, it seems that because the Mid-40 does not have an IMU, it simply does not know it is being rotated, so it does not add a new point cloud to the one previously captured; it just replaces it in the same position, right?

Will this method work with the Horizon/Avia, since they have an IMU built in? And if it does, how do I get the full point cloud?
Jump to the end of the playback when everything is shown and save?
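The stitching described above can be sketched offline in Python, assuming the yaw at each stop is known from the rotator position (a minimal illustration only; `rotate_yaw` and `merge_scans` are hypothetical names, not part of any Livox tool):

```python
import math

def rotate_yaw(points, yaw_deg):
    """Rotate a list of (x, y, z) points about the vertical (z) axis by yaw_deg degrees."""
    a = math.radians(yaw_deg)
    c, s = math.cos(a), math.sin(a)
    return [(c * x - s * y, s * x + c * y, z) for (x, y, z) in points]

def merge_scans(scans):
    """scans: list of (yaw_deg, points) pairs, one per stop.
    Rotates each scan into the frame of the yaw-0 stop and concatenates them."""
    merged = []
    for yaw_deg, pts in scans:
        merged.extend(rotate_yaw(pts, yaw_deg))
    return merged
```

With 12 stops at 30-degree intervals, `merge_scans` would take 12 converted scans and emit one combined cloud; this only works if each stop's yaw is recorded alongside its scan.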




Livox Support (Administrator) | Posted on 2022-2-21 11:08:03
If you want to obtain a 360-degree point cloud through rotation, I recommend using the SDK to obtain the point cloud data. When the lidar rotates, the point cloud will be distorted by the motion; you need to use IMU data to correct this distortion, save the data while acquiring it, and splice the point clouds together as you go. You can refer to my open-source mapping algorithm:
https://github.com/Livox-SDK/livox_mapping
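The distortion correction mentioned above can be sketched like this: each point carries a timestamp, and you rotate it by the yaw the sensor had at that instant rather than by one fixed angle per scan. A minimal Python sketch, assuming per-point timestamps and a known yaw-vs-time function; `deskew_yaw` is a hypothetical name, not Livox SDK code:

```python
import math

def deskew_yaw(points_with_t, yaw_at):
    """Correct motion distortion for a rotating sensor.
    points_with_t: list of (x, y, z, t) tuples; yaw_at: callable t -> yaw in radians.
    Each point is rotated by the yaw the sensor had at its own timestamp."""
    out = []
    for x, y, z, t in points_with_t:
        a = yaw_at(t)
        c, s = math.cos(a), math.sin(a)
        out.append((c * x - s * y, s * x + c * y, z))
    return out
```

In a real pipeline `yaw_at` would interpolate the IMU's integrated yaw track between samples; here it is just any function of time.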

WingmanMedia (Author) | Posted on 2022-2-22 17:27:39
Last edited by WingmanMedia on 2022-2-22 19:20

Thanks for your answer. I have actually asked a lot of questions on GitHub, but it seems there is nobody there to answer any of them.

I try to do a Google search before I ask, but this is all new to me and usually I cannot find anything. I am sure you know the answers, so I will just ask them here:

1) SDK for Windows. It is mentioned on your website, but when clicked it sends me to the Livox SDK on GitHub, and that is all about Linux or ROS. Does a Windows SDK exist, and if so, where can I download it, including docs and examples?

2) Using an external IMU. I have a cheap external IMU with an RS485 interface, 9-axis and no GNSS. Can I connect it to a sync port of the hub and expect the data to be rotated when I use Livox Viewer?

3) Using the same external IMU and recording all angles with a separate device (an Android phone, for example). If I get all the timestamps figured out to sync the data, is there anything inexpensive or free that can merge point cloud data with IMU data and recalculate all the point clouds for any rotation of the lidar?

4) Can the same IMU data be entered into a LAS/CSV file manually? I am not really going to type it in there, but I could write a Windows tool that accepts a point cloud data file and an IMU data file and then replaces the 0 value of the yaw angle with the actual value from the IMU. I am not sure there are any fields for it in LAS files. Are there any, or are the points simply recalculated before the point cloud data is written?

5) I should probably have asked this before question 2, but is IMU data stored anywhere for each point in the point cloud data file?
If not, and there is no post-processing tool that can apply it to the point cloud after capture (so it can only be done while capturing data), then I guess even recording the IMU data is pointless.

6) Your lidar scanner project on GitHub is based on the DJI Manifold. Can I use any Ubuntu PC? First, the Manifold is not available anywhere, and second, it is just too expensive; a good Ubuntu mini PC costs four times less. I bet the app also rotates a DJI motor; can that be eliminated so that I control when my motorised head rotates?

I do not really want to be dragged into development for Linux; I do not know it well and have no use for it apart from the Livox project. I have a lot of professional ASP.NET experience, so if there is a Windows SDK, please let me know where I can get it.

So far I am just playing with changing the Mid-40's angle via its extrinsic calibration. I capture straight ahead, save the data, rotate the lidar on a motorised head by 18 degrees, and then change the angle manually to match the rotation. It seems to work, but it would kill me on a real job even with one row, not to mention that if I want more vertical coverage I would have to do 4-5 rows.

If there is a Windows SDK, can I do the same but with programming? Can I change the yaw in code during capture?
Say, I write code to capture the first set for 3-4 seconds, stop capturing, send a command to set the lidar's yaw calibration to -18 degrees (20 sets for 360), start capturing again for 4 seconds, and repeat until the full 360-degree rotation is done?
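The stepped-capture plan described above (20 stops of 18 degrees, a few seconds of dwell per stop) can be written down as a simple schedule; a hypothetical sketch, with `plan_rotation` standing in for whatever capture loop the SDK code would actually drive:

```python
def plan_rotation(step_deg=18.0, dwell_s=3.0):
    """Return the (yaw_deg, dwell_s) schedule for one full turn.
    With the default 18-degree step this yields 20 stops: 0, 18, ..., 342."""
    steps = int(round(360.0 / step_deg))
    return [(i * step_deg, dwell_s) for i in range(steps)]

# A capture loop would iterate this schedule: for each (yaw, dwell), command the
# motorised head to yaw, capture for dwell seconds, and tag the scan with yaw.
```

The yaw recorded per stop is exactly the value that would later go into the extrinsic calibration (or an offline rotation) of that stop's scan.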


WingmanMedia (Author) | Posted on 2022-2-22 17:39:42
Last edited by WingmanMedia on 2022-2-22 17:42

BTW, I forgot to ask about the Horizon and Avia, which have an IMU on board. Will rotational scanning be easier with either of them in terms of orienting the point cloud when the lidar is rotated?

I only went with the Mid-40 because I wanted to try your lidars first and it is much cheaper. However, if the Avia or Horizon can account for yaw during capture based on their internal IMU, maybe one of them is the way to go for me, and I just need to save up to get either of them.

Livox Support (Administrator) | Posted on 2022-2-23 20:12:39
1. The Windows version of the SDK is the Livox SDK on GitHub. The README contains the build process for the Windows version.
2. Livox Viewer cannot display the data from the IMU.
3. Even if you work out all the timestamps needed to sync the data, there is nothing cheap or free that merges point cloud data with IMU data and recalculates all the point clouds for an arbitrary rotation of the lidar. You would need to write your own code to achieve this.
4. The same as the third question: you need to write your own code.
5. This depends on the product manual of the IMU you bought.
6. You can buy an Ubuntu PC.
7. The SDK mainly acquires point cloud data; it cannot change the pose of the lidar.

Livox Support (Administrator) | Posted on 2022-2-23 20:14:33
The Horizon and Avia have a built-in IMU; by reading the IMU data you can obtain the lidar's acceleration and angular velocity about three axes.
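For a pan-head scan where only a yaw track is needed, the angular rate about the vertical axis can be integrated over time. A minimal sketch assuming fixed-rate gyro samples; `integrate_yaw` is a hypothetical helper, and real IMU data would also need bias removal before integration:

```python
def integrate_yaw(gyro_z_rad_s, dt, yaw0=0.0):
    """Integrate z-axis angular rate samples (rad/s) taken every dt seconds
    into a yaw track, starting from yaw0. Returns one yaw value per sample."""
    yaw = yaw0
    track = []
    for w in gyro_z_rad_s:
        yaw += w * dt  # simple rectangular integration; drifts without bias correction
        track.append(yaw)
    return track
```

The resulting yaw track, looked up by each point's timestamp, is what a de-skewing or splicing step would consume.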

WingmanMedia (Author) | Posted on 2022-2-25 09:25:17
Livox Support Posted at 2022-2-23 20:12
1. The SDK of the windows version is the livox sdk on github. The readme contains the SDK editing pr ...

Thanks for your fast answers. Let me just clarify some of them.

1. I actually tried to compile it for Windows 10 using a repository from another user. It looks like he copied your code and instructions completely and is just calling it https://github.com/kentaroy47/Livox-SDK2.3-for-windows

I used VS 2022 (the latest version) yesterday following his instructions, and it did not work. It could have been not just the VS version but also a Windows Firewall problem, so just in case I removed VS 2022, installed the 2019 version, turned off the Firewall, and ran through your instructions again. This time it worked, although it did not let me build a 64-bit version and I could compile only Win32. I would suggest updating your README for VS 2022, as it is the current version of Visual Studio and some people may want to start with it.

...

3. I would be more than happy if I knew what algorithm to apply for rotating the data with a known yaw parameter. It seems simple and fast to just apply, say, a -18-degree yaw through Extrinsic Calibration to turn the data 18 degrees clockwise. It is just that doing it manually after each turn increases the time for a whole 360-degree rotation from about 1 minute (20 stops of 2 seconds each, plus a little time to rotate from point to point) to 10 minutes or even more.

I am not just guessing here. I have already tried to do all of this manually, and it worked. And I used only Livox Viewer to do it.

So this is what I do, just to test it with one 18-degree rotation clockwise:
I capture data at 0 degrees yaw for 3 seconds, stop, and save what's captured as yaw0.lvs, for example. Then I start another capture just to perform Extrinsic Calibration, stop capturing, select the captured data, enter -18 degrees for yaw, and apply it.
Then I rotate the lidar on the motorised head by 18 degrees, run a capture for another 3 seconds, stop capturing, and save it as yaw18.lvs.

Now, after converting both files to LAS or CSV, I open them in CloudCompare and they are aligned pretty well. There may be some offset, but that can happen because:

a) my rotator may not be rotating exactly 18 degrees; it may be 18.5 or even 18.05.
b) parallax error can come into play, as I have no idea where the laser source sits behind the lidar window. I just try to place the rotation axis close to the reflective window on the Mid-40, but the light source could be much deeper inside.
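The offset in (b) can be modelled geometrically: if the optical center is displaced from the pan head's axis, each scan must be rotated about that axis rather than about the sensor origin, i.e. p' = R(p - o) + o. A minimal Python sketch; `rotate_about_axis_point` is a hypothetical helper, and the axis offset (cx, cy) would have to be measured or estimated:

```python
import math

def rotate_about_axis_point(points, yaw_deg, cx, cy):
    """Rotate (x, y, z) points by yaw_deg about a vertical axis through (cx, cy).
    Models a lidar whose optical center is offset from the pan head's rotation axis."""
    a = math.radians(yaw_deg)
    c, s = math.cos(a), math.sin(a)
    out = []
    for x, y, z in points:
        dx, dy = x - cx, y - cy          # shift so the axis is at the origin
        out.append((cx + c * dx - s * dy,  # rotate, then shift back
                    cy + s * dx + c * dy,
                    z))
    return out
```

With (cx, cy) = (0, 0) this reduces to a plain yaw rotation; a nonzero offset reproduces the small misalignment seen between manually rotated scans.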


Since I have compiled the SDK for Windows now, I will see if my old programming skills are enough to make what I want.

But as a side note, are you open to suggestions on what could be added to Livox Viewer so that it:

a) becomes a complete, effective single tool for capturing 360-degree data with a single lidar, with a user-entered preset describing how it will be rotated;
b) does the same but with IMU data coming from a sensor mounted and aligned on top of any of your lidars, feeding all sensor data to the Viewer through a COM port?

If you implement this, you will sell many more of your lidars, because it won't require any programming at all; everything needed would be done in your Viewer. Considering the cost of your products, a lot more end users would buy them, since it would only cost them your hardware, some LiPo batteries, and an IMU in the range of $30-500.


I am posting my progress with the Mid-40 on two forums with a lot of 3D/360 photographers, and their users seem to be very interested. I am sure most of them would go and buy one right now if I said you only need a $600-1300 Livox lidar plus an extra $300 of additional hardware, and you can use the free Livox Viewer to capture 360-degree data with a single unit in an open point cloud format.

Posted on 2023-1-9 23:06:59
WingmanMedia Posted at 2022-2-25 09:25
Thanks for your fast answers. Let me just clarify some of them

1. I have actually tried to compil ...

Good job!

MNM | Posted on 2023-1-13 11:20:05
Hi WingmanMedia,

I may be able to help you out here. Feel free to email me. madnadirmapping AT gmail DOT com so we can talk about your use-case.

MNM

WingmanMedia (Author) | Posted on 2023-2-12 10:35:57
Thanks Madnadir, I have just sent you an email.
