Brainstorming: Improving navigation without using GPS

Fürst Ruprecht

I would like to think with you about ways to improve navigation / positioning accuracy (without GPS). If possible, the intelligence should ride on the mower.

Magnetic field of the wire: is it possible to generate a magnetic field with the perimeter wire whose characteristic field-strength distribution allows the position/coordinates to be inferred?
Several perimeter sensors could sample the spatial axes for this purpose.

Camera: there are already solutions that recognise the position of defined objects in the image. Is it possible to determine the position of the mower by positioning these (different) objects on the lawn and triangulating the results (of two images each)?

Regards, Fürst Ruprecht
 
I once saw an example where the robot built its own map out of small squares the size of the robot. Each time it had driven one robot length, it stored a new square. Unfortunately I don't have the link at hand.
 
- Capture critical images and attach a behaviour rule to each (if tree trunk -> reverse to the right)
- After every x cm of travel, capture an image and compare it with the critical images; if an image is recognised, store the new image with the existing rule (to account for changes in the environment over time)
- The rules could be taught in via remote control -> overlay the control commands while capturing the images
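The image-matching step in the second bullet could be sketched like this (a hypothetical minimal version; a real implementation would use proper image features rather than raw pixel differences):

```python
# Hypothetical sketch of the matching step above: compare a freshly
# captured frame against stored "critical images" and return the
# behaviour rule of the best match. Images are flat lists of
# equally sized grayscale pixel values in 0..255.

def match_critical_image(frame, critical_images, threshold=30.0):
    """Return the rule of the closest stored image, or None if no match."""
    best_rule, best_score = None, threshold
    for image, rule in critical_images:
        # mean absolute pixel difference as a crude similarity score
        score = sum(abs(a - b) for a, b in zip(frame, image)) / len(frame)
        if score < best_score:
            best_rule, best_score = rule, score
    return best_rule

# Example: a stored "tree trunk" image and a slightly changed new frame
trunk = [100] * 16
frame = [110] + [100] * 15  # small change, e.g. lighting
print(match_critical_image(frame, [(trunk, "reverse right")]))  # -> reverse right
```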
 
Be careful.
If you have no RTK (i.e. a GPS for less than 50 Eur):
to avoid wasting your time, please first consider the GPS accuracy.
Each time you localize the robot, it's a 3x3 or 5x5 meter square that you need to draw.
 
Here is my thought and the result of my tests

For GPS, I want to say that RTK at 700 Eur is OK, but a simple M6N or M8N GPS with about +/- 5 meters of precision cannot be used for precise localisation (only to define zones or an approximate starting point).


Regarding vision, distance measurements to objects need images of very good quality. And outdoors, when the sun is facing the mower or night falls, the result will surely be wrong.

It's an old image: the mower is facing a window. What can be analysed???
TensorFlow on a Pi 4 detects a train??
11.jpg




On the other hand, these technologies are constantly evolving, and the future will certainly contradict this.

@Fürst Ruprecht How long does it take to analyze the images you sent? Can this work in real time?

@Sascha When your mower runs well:

you can test the GPS using Piardu this way:
First I need to restore my last work, because I broke the old code.
The old working process:
The GPS sends NMEA sentences to the DUE, and the DUE forwards them to Piardu.
Piardu simply records everything in a file for the whole mowing duration.
Using the u-center software (from u-blox) you can open the file on your PC and see the mower's mowing track over a satellite image.

You can test the vision process by adding a Pi camera to a Pi 3B+ and using the Piardu software from the AZURITBER master branch
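For reference, the NMEA sentences in such a log file can be decoded with a few lines; this sketch parses the standard $GPGGA sentence (field layout per NMEA 0183):

```python
def parse_gga(sentence):
    """Decode a $GPGGA NMEA sentence to (lat, lon) in decimal degrees.

    NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm.
    Returns None if the sentence is not a GGA sentence with a fix.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or fields[2] == "":
        return None
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return lat, lon

line = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print(parse_gga(line))  # roughly (48.1173, 11.5167)
```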
 
With a Neo M6 v1, which I've had running on my desk for a few days, I see drift of up to 20 m.
Unfortunately I can't test the BN-280; it's broken and crashes my PC. I think my TTL-to-USB adapter has corrupted it.
Nevertheless, I would like to use the GPS receiver to see how the robot drives around the garden. With Google Maps you can display this somehow if you save the data.
With the camera, you can certainly do a lot of things with regard to obstacles or signposts.
I once saw a video in which the robot moved to its station according to painted pictures.
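One way to get the saved data onto a map, as mentioned above, is to write the positions out as a KML file that Google Earth / Google My Maps can open. A minimal sketch (the coordinates are illustrative):

```python
def track_to_kml(points):
    """Build a minimal KML document from (lat, lon) pairs.

    Note: KML orders coordinates as lon,lat - the reverse of the
    usual lat,lon convention.
    """
    coords = " ".join(f"{lon},{lat}" for lat, lon in points)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document><Placemark>'
        f"<LineString><coordinates>{coords}</coordinates></LineString>"
        "</Placemark></Document></kml>"
    )

kml = track_to_kml([(48.1173, 11.5167), (48.1174, 11.5168)])
# with open("track.kml", "w") as f: f.write(kml)
```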
 
I once saw a video in which the robot moved to its station according to painted pictures.
This is possible using OpenCV and a picture of perfectly known size, but then you still need a perimeter wire to mow, don't you?
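For a picture of perfectly known size, the distance follows from the pinhole camera model: distance = focal_length_px x real_width / width_in_pixels. A small sketch, with the focal length assumed to come from a prior camera calibration (all numbers illustrative):

```python
def distance_to_marker(real_width_m, width_px, focal_px):
    """Pinhole model: apparent size shrinks linearly with distance."""
    return focal_px * real_width_m / width_px

# Assumed example: a 0.20 m wide sign appears 100 px wide to a camera
# with a calibrated focal length of 1000 px.
d = distance_to_marker(0.20, 100, 1000)
print(d)  # -> 2.0 (meters)
```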
 
Not necessarily: if you put up signs at corners or certain points, the mower knows what to do there.
So it's a kind of signpost, but I don't think you can get a boundary like the perimeter wire that way; maybe if you have a four-sided garden.

I always read in such articles that you need a combination of several sensors (sensor fusion). But everything I've read so far has been rather theoretical, and in practice there's probably nothing that works yet.
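One simple, practical form of that sensor fusion is a complementary filter: trust the gyro-integrated heading short-term and pull it slowly toward an absolute reference such as a compass. A toy sketch (all values illustrative; real code must also wrap angles):

```python
def fuse_heading(gyro_heading, compass_heading, alpha=0.98):
    """Complementary filter: mostly gyro, slightly compass."""
    return alpha * gyro_heading + (1 - alpha) * compass_heading

heading = 90.0  # start at the true heading
for _ in range(100):
    # gyro integration adds 0.1 deg of drift per step; compass says 90
    heading = fuse_heading(heading + 0.1, 90.0)
print(round(heading, 1))  # stays near 90 instead of drifting to 100
```

Without the compass term, 100 steps of 0.1 deg drift would put the heading at 100 deg; the filter bounds the error at roughly alpha/(1-alpha) times the per-step drift.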
 
I saw a table of results where the processing times were around 20 ms, but no information about the general conditions (hardware, number of images, memory, etc.).
 
Coordinate system: project a grid of a coordinate system onto the mowing area with a laser (poor in sunshine). The line number is encoded into the laser beam (like an audio transmission). When driving below a grid point, the mower recognizes its position.
 
were around 20ms.
That's very good, but as you say, what is the hardware used for processing?

For a mowing area with a hard border or perfect delimitation (like in my case: an asphalt road), two cameras can do the job: one filming the ground to detect grass, the other finding the charging station.
A Jetson Nano can manage the two cameras using its CSI ports.
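The grass-detection camera could start from something as crude as a green-dominance test per pixel. This is only a sketch of the idea, with made-up thresholds; a real pipeline would threshold in HSV and filter noise:

```python
def is_grass(r, g, b):
    """Crude green-dominance test for a single RGB pixel.

    Thresholds are illustrative assumptions, not calibrated values.
    """
    return g > 60 and g > 1.2 * r and g > 1.2 * b

def grass_ratio(pixels):
    """Fraction of pixels classified as grass."""
    hits = sum(1 for p in pixels if is_grass(*p))
    return hits / len(pixels)

# 9 greenish "lawn" pixels and 1 gray "asphalt" pixel
lawn = [(60, 140, 50)] * 9 + [(120, 120, 120)]
print(grass_ratio(lawn))  # -> 0.9
```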
 
From my observations: camera-only is really hard and needs a lot of processing power. I did lots of tests with an Intel RealSense stereo camera and was able to build a map, but localization was not the best and drifted over time. Lighting conditions also change and can become an issue.
Lidar works well in a small garden with lots of walls, trees and other landmarks, but fails on large areas.

GPS gives accuracy down to about 1.5 m, which is good enough to detect zones. It might be exact enough in combination with a perimeter wire. You get a feeling for whether you have covered all areas, and in my opinion that is how commercial GPS mowers work.

Watching the Bosch Indego: it mows in lanes, but there is no camera, no GPS, no lidar on board. It seems to rely only on IMU and odometry. When it hits the perimeter, it tracks it for a while and estimates its position from that.
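IMU-plus-odometry dead reckoning of this kind boils down to integrating wheel travel along the current heading. A minimal differential-drive pose update (all numbers illustrative):

```python
import math

def odom_step(x, y, theta, d_left, d_right, wheel_base):
    """Advance a differential-drive pose by one encoder reading.

    d_left/d_right are the distances travelled by each wheel (m),
    wheel_base is the distance between the wheels (m).
    """
    d = (d_left + d_right) / 2.0          # distance of the robot center
    theta += (d_right - d_left) / wheel_base  # heading change
    x += d * math.cos(theta)
    y += d * math.sin(theta)
    return x, y, theta

# Drive 1 m straight, spin ~90 deg in place, then 1 m straight again
pose = (0.0, 0.0, 0.0)
pose = odom_step(*pose, 1.0, 1.0, 0.3)         # straight along x
pose = odom_step(*pose, -0.2356, 0.2356, 0.3)  # turn ~pi/2 in place
pose = odom_step(*pose, 1.0, 1.0, 0.3)         # straight along y
print([round(v, 2) for v in pose])  # -> [1.0, 1.0, 1.57]
```

In practice the heading would come from the IMU rather than from the wheel difference alone, since wheel slip on grass makes pure odometry drift quickly.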

For my ROSMower, I recently purchased an RTK GPS system. It is the most expensive but most reliable solution. For 422€ you're ready to go with base and rover. If you're lucky and have a public NTRIP caster close to your home, 211€ is the price for RTK GPS.
 
An Ardusimple kit contains a board and a u-blox patch antenna. I purchased it directly from Ardusimple as that is the cheapest option: 211€ plus tax. In my case tax doesn't matter, as I purchased it through my company.
I purchased two of them without any radios. My robot is equipped with a Jetson Nano, comparable to a Raspberry Pi. The base station is equipped with an OrangePi Zero; an ESP32 is also OK. It just streams the correction data over my local WiFi network.
As I only needed the kits and already had the other components, it cost 420€ in total.
 
A single Ardusimple RTK set without any base and correction data gives an accuracy of approx. 2 m.
 
My narrowest passage is about 50 cm wide (the mower is 44 cm wide). Of course, this could have been handled better when planning the garden. I have very few fixed borders, and there is no clear line of sight either. In most places the lawn merges into the neighbour's lawn or meadow. A navigation solution must be accurate to within 15 cm for my use case.
 