Implement trilateration of devices #21
Some thoughts on this:
Heheh... if there's anything this project has taught me, it's that "ideas are cheap" - it's the implementation where the real cost comes in! :-D I've been mulling over most of those ideas for quite a while, and I chose the above path because I think it might be the most viable approach. Some of my thought processes on this are:
Anyway, I could talk for ages about the what-ifs and nice-to-haves, but for a while I am going to be limited to:
Something I don't see a lot of in the prior art is addressing the "nature" of RSSI noise. Lots of filtering assumes that noise is symmetrical around the signal, but RSSI has the property that almost all noise (I'm guessing over 99%) adds to the distance. So if you receive a stronger signal, you should assume it's "more accurate" than any weaker signal, because all the things that mess with signal strength (propagation) tend to reduce it - except for the rare case of multipath (reflections) that happens to be in-phase, which should be a minuscule number, since the odds are much more in favour of a reflected signal being out of phase. I probably need to read up on that a bit if I'm going to pin my entire reasoning on it! :-)

I expect that once I make an official release with the recent changes (UUID/iBeacon support in particular) we might get a lot more users coming on board, as tracking Android phones is, I think, a common desired use-case. So I want to get some of the basics bedded down a bit better for that, like having it recover from restarts/reloads, not causing errors in logs, etc - the sort of things that otherwise result in more time spent supporting it rather than working on it.

Also, I think my documentation is becoming less apt. The readme feels a bit long and unwieldy, so I might need to move some of that info into a separate "manual" and trim the readme down so it gets the basics across in a clear way. I'm always frustrated by projects that are hard to get a solid overview of without digging too deep.

Anyway, thanks for sharing your thoughts. I think we have similar ideas about how to progress towards realising the ultimate goal for Bermuda, and I think we've got a good chance to build it into something pretty cool (actually, I think it's already pretty cool, and that's just in trying to achieve feature-parity with what else is out there).
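A minimal sketch of that asymmetric-noise idea (not Bermuda's actual filter, and the alpha weights are made-up assumptions purely for illustration): a smoother that lets stronger readings pull the estimate up quickly, while weaker readings only drag it down slowly.

def asymmetric_smooth(prev_rssi, new_rssi, rise_alpha=0.8, fall_alpha=0.1):
    # Stronger (higher) RSSI samples are trusted and adopted quickly;
    # weaker samples are assumed to be attenuation/noise and only pull
    # the estimate down slowly.
    if prev_rssi is None:
        return new_rssi
    alpha = rise_alpha if new_rssi > prev_rssi else fall_alpha
    return prev_rssi + alpha * (new_rssi - prev_rssi)

# A burst of weak readings barely moves the estimate, but one strong
# reading pulls it up almost immediately.
est = None
for sample in [-75, -78, -80, -62, -79]:
    est = asymmetric_smooth(est, sample)
    print(f"sample={sample:>4}  estimate={est:.1f}")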
After some research, further notes on this line for when the time comes... It is indeed possible to perform trilateration without knowing the exact locations of the proxies, as long as we have information about which room each proxy is in. This approach is known as anchor-free or anchor-unknown localization. The basic idea is to use the distance estimates between the proxies themselves, in addition to the distance estimates between the proxies and the target device, to estimate the relative positions of the proxies and the device within the same coordinate system. Here's a general outline of the steps to follow:
This anchor-free localization approach has the advantage of not requiring any prior knowledge of the proxy locations, but it typically requires more computational effort and may be less accurate than methods that use known proxy locations. Additionally, we may need to incorporate techniques to handle potential issues like flip ambiguities (reflections in the coordinate system) and translation ambiguities (arbitrary shifts in the coordinate system) that can arise in anchor-free localization. The specific algorithm and techniques to use will depend on factors such as the number of proxies, the accuracy of the distance estimates, and the complexity of the room layouts.
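One possible way (a sketch only, not an agreed design) to handle those flip and translation ambiguities later on: assuming we eventually know rough "real" coordinates for a few anchor proxies (three non-collinear ones pin it down fully, e.g. from a floor plan), the MDS output could be aligned to those anchors with an orthogonal Procrustes fit, which resolves rotation, reflection and translation in one step.

import numpy as np
from scipy.linalg import orthogonal_procrustes

def align_to_anchors(mds_positions, anchor_idx, anchor_coords):
    """Rotate/reflect and translate MDS positions onto known anchor coords."""
    src = mds_positions[anchor_idx]            # MDS coords of the anchor proxies
    dst = np.asarray(anchor_coords, float)     # their known floor-plan coords
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    # Best orthogonal transform (rotation/reflection) mapping centred src onto dst
    R, _ = orthogonal_procrustes(src - src_mean, dst - dst_mean)
    return (mds_positions - src_mean) @ R + dst_mean

# Hypothetical usage: if proxies 0, 1 and 2 are known to sit at these spots,
# align every estimated position into that frame:
# aligned = align_to_anchors(proxy_positions, [0, 1, 2],
#                            [(0.0, 0.0), (8.0, 0.0), (3.0, 6.0)])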
Assuming we can get the Distance Matrix Estimation in metres from each proxy to each other proxy, like I posted on March 7, into an array of distance_estimates, an example of the code needed might be:
import numpy as np
from sklearn.manifold import MDS
# Distance matrix between proxies (metres)
# Assuming you have estimated the pairwise distances between proxies
# and stored them in distance_matrix (symmetric, with a zero diagonal)
distance_matrix = np.array([[0, 5, 8, 9],
                            [5, 0, 7, 6],
                            [8, 7, 0, 3],
                            [9, 6, 3, 0]])
# Perform MDS directly on the precomputed distance matrix
# (no pdist/squareform round trip is needed - the matrix is already
# a full pairwise distance matrix)
mds = MDS(n_components=2, dissimilarity='precomputed')
proxy_positions = mds.fit_transform(distance_matrix)
# Print the estimated relative positions of the proxies
print("Estimated relative positions of the proxies:")
for i, pos in enumerate(proxy_positions):
    print(f"Proxy {i+1}: ({pos[0]:.2f}, {pos[1]:.2f})")
In this example:
Then for the device localisation:
import numpy as np
from scipy.optimize import least_squares
def trilaterate(proxy_positions, distances):
"""
Trilaterate the position of a device given the relative positions of the proxies
and the distances between the device and each proxy.
Args:
proxy_positions (numpy.ndarray): 2D array of relative positions of the proxies (x, y)
distances (numpy.ndarray): 1D array of distances between the device and each proxy
Returns:
numpy.ndarray: Estimated position of the device (x, y)
"""
def objective_function(coords, proxy_positions, distances):
x, y = coords
residuals = []
for i in range(len(proxy_positions)):
x_proxy, y_proxy = proxy_positions[i]
distance_expected = np.sqrt((x - x_proxy) ** 2 + (y - y_proxy) ** 2)
residuals.append(distance_expected - distances[i])
return residuals
initial_guess = np.mean(proxy_positions, axis=0)
result = least_squares(objective_function, initial_guess, args=(proxy_positions, distances))
return result.x
# Example usage
# Assuming you have obtained the relative positions of the proxies using MDS
proxy_positions = np.array([[-2.5, 1.0],
[1.5, 2.0],
[3.0, -1.5],
[-1.0, -2.5]])
# Distances between the device and each proxy (in meters)
distances = np.array([5.0, 4.0, 6.0, 3.5])
# Trilaterate the device's position
device_position = trilaterate(proxy_positions, distances)
print(f"Estimated position of the device: ({device_position[0]:.2f}, {device_position[1]:.2f})") The result of the example In the MDS step, the relative positions of the proxies were determined, and these positions were used to define a new coordinate system. The proxy positions in the example were:
These proxy positions effectively define the origin and scale of the new coordinate system. The trilateration step then estimates the device's position within this relative coordinate system, using the distances between the device and each proxy. The estimated position However, this estimated position is not an absolute location in a physical space, like meters or feet. It's a relative position within the coordinate system defined by the proxy positions. The units of the coordinates are arbitrary and determined by the scale of the proxy positions. To interpret the estimated device position more meaningfully, you would need to map or transform the relative coordinates to physical coordinates or room boundaries. This could involve techniques like:
So, while the estimated position (-3.05, -2.22) itself doesn't directly represent a physical location, it serves as an intermediate step in the overall process of locating the device and determining the probability of it being in each room.
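One very simple sketch of that mapping step (nearest-proxy assignment is just one of the possible techniques alluded to above, and the room labels here are hypothetical): since each proxy's room is known even though its absolute coordinates aren't, the device can be assigned to the room of its nearest proxy in the relative coordinate system.

import numpy as np

def nearest_room(device_xy, proxy_xy, proxy_rooms):
    # Whichever proxy is closest in the relative coordinate system,
    # report its (known) room as the device's most likely room.
    dists = np.linalg.norm(np.asarray(proxy_xy) - np.asarray(device_xy), axis=1)
    return proxy_rooms[int(np.argmin(dists))]

# Using the example MDS positions and the (-3.05, -2.22) estimate above,
# with purely hypothetical room names for the four proxies:
rooms = ["Office", "Hallway", "Garage", "Kitchen"]
proxies = [[-2.5, 1.0], [1.5, 2.0], [3.0, -1.5], [-1.0, -2.5]]
print(nearest_room([-3.05, -2.22], proxies, rooms))  # -> "Kitchen"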
And here it is cleaned up a bit, with the potentially resource-heavy MDS function separated out, since the proxies are fixed and so do not need to be recalculated often. Added a plot and some x/y mirroring flags to help with testing.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS
import matplotlib.pyplot as plt
def calculate_proxy_positions(proxy_distances):
# Handle missing data in proxy distances
proxy_distances_masked = np.ma.masked_invalid(proxy_distances)
proxy_distances_filled = proxy_distances_masked.filled(proxy_distances_masked.mean())
# Perform multidimensional scaling
mds = MDS(n_components=2, dissimilarity='precomputed', random_state=42, normalized_stress='auto')
proxy_positions = mds.fit_transform(proxy_distances_filled)
# Print the results
print("Proxy positions:", proxy_positions)
print()
return proxy_positions
def localize_beacon(proxy_positions, proxy_names, beacon_name, beacon_distances, mirror_x=False, mirror_y=False):
def trilateration(distances, positions):
def error(x, c, r):
return np.linalg.norm(x - c, axis=1) - r
l = len(positions)
S = sum(distances)
# Approximate initial guess
x0 = [sum([positions[i][0] * distances[i] / S for i in range(l)]),
sum([positions[i][1] * distances[i] / S for i in range(l)])]
res = least_squares(error, x0, args=(positions, distances))
return res.x
# Handle missing data in beacon distances
beacon_distances_masked = np.ma.masked_invalid(beacon_distances)
valid_indices = np.where(~beacon_distances_masked.mask)[0]
valid_beacon_distances = beacon_distances_masked[valid_indices].data
valid_proxy_positions = proxy_positions[valid_indices]
# Calculate beacon position using trilateration with valid data
beacon_position = trilateration(valid_beacon_distances, valid_proxy_positions)
# Calculate distances from beacon to each proxy
distances_to_proxies = np.linalg.norm(beacon_position - proxy_positions, axis=1)
# Calculate probabilities based on inverse distances
# Note: Probabilities are more an indicator of relative proximity than a true statistical probability
inverse_distances = 1 / distances_to_proxies
probabilities = inverse_distances / np.sum(inverse_distances)
    # Mirror the positions if requested (work on a copy so the caller's
    # proxy_positions array isn't modified, which would otherwise carry the
    # mirroring over into subsequent calls)
    if mirror_x or mirror_y:
        proxy_positions = proxy_positions.copy()
    if mirror_x:
        proxy_positions[:, 0] *= -1
        beacon_position[0] *= -1
    if mirror_y:
        proxy_positions[:, 1] *= -1
        beacon_position[1] *= -1
# Print the results
print("Beacon name:", beacon_name)
print("Beacon position:", beacon_position)
print("Distances to proxies:", distances_to_proxies)
print("Probabilities:")
for i in range(len(proxy_names)):
print(f"{proxy_names[i]}: {probabilities[i]:.3f}")
# Plot the positions
plt.figure(figsize=(8, 6))
proxy_x = [x for x, y in proxy_positions]
proxy_y = [y for x, y in proxy_positions]
x_lim = max(abs(min(proxy_x)), abs(max(proxy_x)))
y_lim = max(abs(min(proxy_y)), abs(max(proxy_y))) * 1.1
lim = max(x_lim, y_lim)
for i, (x, y) in enumerate(proxy_positions):
plt.scatter(x, y, marker='o')
plt.annotate(proxy_names[i], (x, y), textcoords="offset points", xytext=(0, -15), ha='center')
beacon_x, beacon_y = beacon_position
plt.scatter(beacon_x, beacon_y, marker='*', s=200, c='r')
plt.annotate(beacon_name, (beacon_x, beacon_y), textcoords="offset points", xytext=(0, -15), ha='center')
# plt.legend()
plt.xlabel('X')
plt.ylabel('Y')
plt.xlim(-lim, lim)
plt.ylim(-lim, lim)
plt.gca().set_aspect('equal', adjustable='box')
plt.title('Positions of Proxies and Beacon')
plt.show()
return beacon_position, distances_to_proxies, probabilities
# Proxy data in metres from each other
proxy_names = ['Office', 'Garage', "Living room", 'Gate', 'Kitchen']
proxy_distances = np.array([[ 0.0, 8.0, 11.0, 17.0, 9.5],
[ 8.0, 0.0, 15.0, 19.0, 15.0],
[11.0, 15.0, 0.0, 7.0, 2.5],
[17.0, 19.0, 7.0, 0.0, 9.5],
[ 9.5, 15.0, 2.5, 9.5, 0.0]])
proxy_positions = calculate_proxy_positions(proxy_distances)
# Beacon data in meters
beacon_name = "iPhone1"
beacon_distances = np.array([2, 9, 10, np.nan, 8.5])
beacon_position, distances_to_proxies, probabilities = localize_beacon(proxy_positions, proxy_names, beacon_name, beacon_distances, mirror_x=True, mirror_y=True)
beacon_name = "iPhone2"
beacon_distances = np.array([2, 6, 10.5, 16, 9.5])
beacon_position, distances_to_proxies, probabilities = localize_beacon(proxy_positions, proxy_names, beacon_name, beacon_distances, mirror_x=False, mirror_y=False)
Now that's exciting! Amazing stuff :-) For some reason I hadn't seen your post from March 7, either. So glad you were able to find the algorithms that I was trying to describe! I knew that once we had estimates for each side of each triangle we'd be able to reach a solution, and I was pretty sure numpy or others would have those things, but I was at a loss as to how to find them. This is awesome. MDS is exactly what I was trying to describe previously - once we have a "relative" solution for all the distances, we simply scale against the known real-world datum points. I also really like that you get a probabilistic result, too - since our measurements are inherently noisy, I definitely wanted to be able to return a "radius of confusion" for the estimates.

Re the inter-proxy distances, the Shellys must be acting as beacons, which the esphome proxies do not. I tried setting them up to do so, but in the config at least the beacon and proxy roles are mutually exclusive. I don't think this is an inherent limitation though, just one that esphome hasn't tackled, probably due to not foreseeing a use-case. That's super handy for the Shellys, though!

The last week or so I've been thinking that Bermuda might be at about the right stage to start tackling the trilateration, with the core functionality reasonably bedded down and the filtering being probably "good enough", so I'm keen to start working on this - your timing is perfect! :-) I think I'll start with implementing it in its own .py module. I'll also get on to gathering the inter-proxy distances (directly in the case of Shellys, and indirectly via the triangle inequality for others). We might need the facility to tag certain proxies as "mobile" - for example if a proxy is mounted to a robot vacuum :-) - we'd need to exclude them from the static proxies used for the solution.

Thanks again for this amazing work!
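A rough sketch of what the "indirectly via triangle-inequality" idea might look like (this is an assumption about the approach, not an agreed design): for every device heard by both of a pair of proxies, the inter-proxy distance must lie between |da - db| and da + db, so aggregating over many shared devices narrows the plausible range.

def inter_proxy_bounds(shared_device_distances):
    """shared_device_distances: list of (dist_to_proxy_a, dist_to_proxy_b)
    pairs, one per device currently heard by both proxies."""
    # Triangle inequality: |da - db| <= d(a, b) <= da + db for every device,
    # so the tightest bounds come from the extremes over all devices.
    # With noisy estimates the bounds can even cross, which is itself a
    # useful consistency check.
    lower = max(abs(da - db) for da, db in shared_device_distances)
    upper = min(da + db for da, db in shared_device_distances)
    return lower, upper

print(inter_proxy_bounds([(2.0, 9.0), (5.5, 4.0), (7.0, 3.5)]))  # -> (7.0, 9.5)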
Great idea to put it in its own .py. I haven't developed any integration, and from what little looking into it I have done, it seems quite a thing to wrap your head around. Hat tip to you, Ashley. So yes, that would help me and friends give our input. I've made a few very capable friends who've helped with this, particularly a chap named Claude.

The probabilistic result is what I was aiming for, but what we actually have now is a normalised estimation of proximities rather than a true statistical probability. For the purposes of this project, though, it should hopefully be sufficient.

For the proxies, maybe this'll be a good excuse to automate some light switches or power points. I find them cheap and very reliable. They report changes (switch, temp, power, etc.) immediately (less than 500ms) - the connection is live, I believe. The Gen2+ ones have two radios, which can be used for two WiFi connections or other purposes, but most likely helps with detecting Bluetooth without interrupting the WiFi connection. Highly recommend the Gen2+ models. They've also helped me get a comprehensive picture of live power and energy usage (kids hate it, lol).

Yes, I am the same: I do not see my three ESPHome M5Stacks, but I do see all (I think) of my Gen2 Shellys (Gen1 doesn't have Bluetooth) and some others like smart speakers, a water sprinkler, door locks and such. There are actually 110 more devices I can select, but I suspect the majority are people or cars passing by outside the gate. Might be an idea to have a configurable expiration date on these things some day.

Point is, people may have many more devices linked to rooms that, although they won't proxy, will aid in trilateration of the rooms - through clustering of the devices in a room to get a mean location of the room/area and its boundaries. For example, in my living room I have two Shellys and a smart speaker, and they are all up against the room boundaries (walls), so anything that is estimated to be within those beacons can be regarded as actually in the room - more definitive than just near the room. Hence, it may not be needed, but it would be good to keep in mind the facility to add other tags, in addition to fixed/mobile, such as "against wall" vs. "inside room".

Let me know when you're ready for me to look at the new file, or any questions about the code above, of course.
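A small sketch of the "within those beacons means actually in the room" idea (the coordinates and device placements here are made up for illustration): once the fixed devices sitting against a room's walls have relative coordinates, a point inside their convex hull can be treated as being in that room.

import numpy as np
from scipy.spatial import Delaunay

# Relative coordinates (made up) of the living room's fixed devices,
# all of which sit against the walls.
living_room_devices = np.array([[0.0, 0.0],   # Shelly by the door
                                [4.0, 0.5],   # Shelly by the window
                                [2.0, 3.5]])  # smart speaker on a shelf
hull = Delaunay(living_room_devices)

def inside_living_room(point):
    # find_simplex() returns -1 for points outside the triangulated hull
    return int(hull.find_simplex(np.atleast_2d(point))[0]) >= 0

print(inside_living_room([2.0, 1.0]))   # True  - inside the triangle of devices
print(inside_living_room([6.0, 6.0]))   # False - outside it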
Awesome! Hmm, I flashed all my Shellies (Gen3) with esphome. Following this thread!
Ashley, do your esphome proxies see any other static Bluetooth devices in your house like mine do (speakers and such)? If they are in specific rooms, perhaps they can be used to provide some trilateration data too. First principles tell me that if proxies in specified areas can calculate distances to something in a specific fixed area, that can be used to provide data that would help with the trilateration. Just trying to think of ways that you could get some distances in your home (in a repeatable way for general users) for testing.
Oh yeah, heaps of stuff. Including neighbours driving past (someone has a corporate vehicle which has an OBD-II interface attached to it). I deliberately retain all seen Bluetooth devices in Bermuda's internal state, as even though a user may only be interested in a small number of devices, I knew that it would be helpful to leverage data from all devices to build a sense of the environment. So the plan was always to keep all adverts seen, so that the data can be used to refine estimates regardless of whether those particular devices are of interest to the user.

It may become helpful to allow the user to tag certain devices as "known static devices" to serve as datum points, but there's also a lot that can be done just by considering the overall "picture" of devices. For example, when a proxy gets occluded by something, we'll see many device ranges to that proxy increase, so the algorithms can weight that accordingly. Some of this I think will happen for free, some will take deliberate incorporation - but it's been a core idea from the start, and why I've always believed that this could work despite people saying that RSSI measurements are too noisy: because we have a lot more measurement points than first appears obvious, we can use them to smooth out some of the ephemeral glitchiness.

I also have the initial bits of doing some "fingerprinting" of devices, so that certain device types can be automatically tagged as being "static" - Google Home speakers, Chromecasts, Apple TVs, door/window sensors - all those product types can typically be assumed to be physically static, in order to establish some fixed points of reference to scale everything else around.
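A toy sketch of what that fingerprinting might look like (purely illustrative - the name patterns and matching on the advertised local name are assumptions, not Bermuda's actual mechanism):

# Product types that can usually be assumed to stay in one place.
STATIC_NAME_HINTS = ("google home", "nest mini", "chromecast", "apple tv",
                     "shelly", "door", "window")

def looks_static(advertised_name):
    # Very naive matching on the advertised local name; a real implementation
    # would more likely key off manufacturer data or service UUIDs.
    name = (advertised_name or "").lower()
    return any(hint in name for hint in STATIC_NAME_HINTS)

print(looks_static("ShellyPlus1PM-A8032AB12345"))  # True
print(looks_static("Bob's iPhone"))                # False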
Raised issue in esphome for allowing ble proxies to also act as beacons (thus providing inter-proxy distance estimations) at esphome/feature-requests#2689
Some research on BLE-Based Indoor Positioning:
Cheers, I'll take some time to read them properly when I can. The first looks interesting after reading the abstract, as I was thinking that Kalman filters might become useful in the latter stages of the pipeline (I don't think they're well-suited to RSSI values directly, but as a step later in the processing they may help with restoring some latency / semi-predictive aspects). The second one I'll have to try again when I am less tired and less grumpy - it reads badly, like it was put together by an LLM switching between basic and overly-specific levels of info, and some of the claims are... less than well-supported, like their conclusions about tx power and interval - very dodgy reasoning there, as far as I can see right now. But that might be just me :-) I'll read more closely when I can take the time to process it properly. Thanks for the tip-off!
Looks like this request has been merged.
Some interesting developments in Bluetooth 6 with Bluetooth Channel Sounding: https://www.bluetooth.com/channel-sounding-tech-overview/
This is the primary goal of this integration, but the first milestones to hit are:
- Area of a device, based on the Area of the closest bluetooth receiver
- device_tracker_ble functionality of providing home|not_home state for Person entities.

The milestones for this feature are:
- Improve filtering of distance estimates for accuracy and stability. Currently thinking Extended Kalman Filters with FilterPy? I wonder if Wouter is looking for projects :) (current filtering seems reasonably stable while also responsive. I don't think Kalman is well-suited to the noise distribution, so this is "good enough" for now).

Extension goals: