<Advent of Code 2021 day 19> :thread:
# advent-of-code
k
I can't believe I corrected a single line in one of Copilot's larger suggestions, and that ended up being the wrong correction (Copilot was right), so I missed out on a large number of points after spending like 10 minutes debugging it
😔 1
e
It was really kind of Eric to explain how to construct the 24 rotations, but still…
e
alright, down to ~5s in my Haskell solution, now time to port it to Kotlin and see how it does…
i
When constructing rotations, I've got 48 distinct matrices, but it seems that half of them are mirroring, like
```
[-1, 0, 0]
[0, 1, 0]
[0, 0, 1]
```
Is there a simple criterion for distinguishing a rotation matrix from a mirroring one?
e
The determinant of a rotation matrix is 1 (a mirroring one has determinant -1)
šŸ‘ 1
To avoid the matrix math, I constructively perform rotations just how Eric described it in the problem statement (3 ways to pick the axis you look at; 2 ways its direction; 2 more ways to pick the up axis; 2 more ways to pick its dir; the final axis and its direction is determined unambiguously, but figuring its direction requires some thinking), see https://github.com/elizarov/AdventOfCode2021/blob/0bcea3df8ea792be26075dcbc369a0990458e5ee/src/Day19.kt#L82-L90
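The determinant test mentioned above can be sketched like this (illustrative code, not anyone's actual solution): enumerate all 48 signed axis permutations and keep only the 24 proper rotations.

```kotlin
// Illustrative sketch: the 48 signed permutation matrices, filtered by
// determinant to the 24 proper rotations (mirrorings have det == -1).
typealias Mat = List<List<Int>>

fun det3(m: Mat): Int =
    m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1]) -
        m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0]) +
        m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0])

val signedPermutations: List<Mat> = buildList {
    val axes = listOf(0, 1, 2)
    for (a in axes) for (b in axes) for (c in axes) {
        if (setOf(a, b, c).size != 3) continue // need a real permutation
        for (sa in listOf(1, -1)) for (sb in listOf(1, -1)) for (sc in listOf(1, -1)) {
            val perm = listOf(a to sa, b to sb, c to sc)
            add(List(3) { r -> List(3) { col -> if (perm[r].first == col) perm[r].second else 0 } })
        }
    }
}

val properRotations: List<Mat> = signedPermutations.filter { det3(it) == 1 }
```

Half of the 6 × 8 = 48 matrices have determinant +1, which gives exactly the 24 rotations.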
Also, this time I've taken my motto that "AoC is not about performance" to the limit and implemented everything as straightforwardly as possible (e.g., my 3D point is simply a typealias for `List<Int>` 😱 and I copy them all the time without ever trying to save on copying). The resulting code is quite concise, but it takes over 10 seconds to complete.
Switching to a `data class` with `val x: Int, val y: Int, val z: Int` makes it significantly faster (~5 sec), but forces writing more code.
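For reference, a minimal sketch of the data-class variant described above (the operator overloads are illustrative additions, not the actual code):

```kotlin
// Minimal sketch of a 3D point as a data class; avoids the per-element
// boxing and list allocation of the List<Int> representation.
data class Point3(val x: Int, val y: Int, val z: Int) {
    operator fun plus(o: Point3) = Point3(x + o.x, y + o.y, z + o.z)
    operator fun minus(o: Point3) = Point3(x - o.x, y - o.y, z - o.z)
}
```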
d
I’ve successfully detected which scanners overlap with which other scanners, but grinding out the beacon correlation function between scanners and from that the position relative to scanner 0 seems like it’s going to maybe take more free time than I have right now 🤣 I need some sleep
e
@Dan Fingal-Surma I’ve spent some time thinking about that math, too, but then I remembered it is “Advent of Code” (not Maths!), so to transform everything into 0-th scanner reference frame, I just programmatically perform the corresponding sequence of transformations between pairs of reference frames.
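That idea can be sketched with assumed names (`FrameChange` etc. are illustrative, not the actual code): a frame change is "rotate, then translate", and chaining two of them yields another of the same shape, so scanner N's beacons can be walked pairwise back into scanner 0's frame.

```kotlin
// Illustrative sketch of composing reference-frame transformations.
data class Vec3(val x: Int, val y: Int, val z: Int) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
}

class FrameChange(val rotate: (Vec3) -> Vec3, val shift: Vec3) {
    fun apply(p: Vec3): Vec3 = rotate(p) + shift

    // Composition works because rotations are linear:
    // outer(this(p)) = (outer.rot after this.rot)(p) + outer(this.shift)
    infix fun andThen(outer: FrameChange): FrameChange =
        FrameChange({ outer.rotate(rotate(it)) }, outer.apply(shift))
}
```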
n
I still think that I am not using all the information, especially the "a scanner sees all beacons in 1000 range" part. That should make it quicker to rule out a candidate delta between 2 scanners: if a beacon is in range of both but not seen by both, then that scanner delta is incorrect. But the list of possible scanner deltas is still huge...
d
Ah good point — given two overlapping scanners, and 1 pair of beacons with the same offset signature in each, you can generate the 2 consistent transforms to the second scanner’s frame of reference, and then check those 2 candidates against the other points to figure out the correct 1
I still need to go to bed though 😓
n
me too (US west coast..)
same 1
e
https://github.com/ephemient/aoc2021/blob/main/kt/src/commonMain/kotlin/com/github/ephemient/aoc2021/Day19.kt mostly a straight port from my Haskell solution, but required a couple tweaks:
`.sortedByDescending { (key, _) -> delta.intersect(key).size }` took 45s, while `.map { (key, value) -> key.count { it in delta } to value }.apply { sortByDescending { it.first } }` takes 2.5s ❗
(I was really expecting the built-in `intersect` to be smarter and handle swapping of arguments (depending on cardinality) for better runtime on its own, but it seems it's documented to preserve the order of the receiver, so it can be pretty terrible)
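The contrast boils down to something like this (a sketch, not the actual repo code): `intersect` materializes a `LinkedHashSet` copy of the receiver and retains matches, while a plain membership count allocates nothing.

```kotlin
// Two ways to count overlap. Kotlin's intersect copies the receiver
// into a LinkedHashSet (preserving its order), so it does work
// proportional to the receiver even when the other set is tiny.
fun <T> overlapViaIntersect(a: Set<T>, b: Set<T>): Int = a.intersect(b).size

fun <T> overlapViaCount(a: Set<T>, b: Set<T>): Int = a.count { it in b }
```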
oh, down to 2s now that I've double-checked this thread and the instructions and realized that reflections can be excluded
m
I have no shame in my orientations part of the "algorithm". I got the runtime down to ~750ms but finding the offset that finds at least 12 beacons is probably where I lose most my time. Funnily enough, the algorithm still works if I set MINIMALBEACONS to 3. https://github.com/mdekaste/AdventOfCode2021/blob/main/src/main/kotlin/year2021/day19/Day19.kt
is there a faster way to check set overlap than my O(n²)? it feels like there is...
```kotlin
totalBeacons.flatMap { c1 -> orient.map { c2 -> c1 - c2 } }
    .groupingBy { it }
    .eachCount()
    .filterValues { it >= MINIMALBEACONS }
```
managed to get it running under .5 seconds, but now it includes a try-catch lol
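A self-contained sketch of the delta-counting trick above (names are illustrative): assuming both scans are already in the same orientation, the true offset is the pairwise difference that occurs at least `minBeacons` times.

```kotlin
// Find the scanner offset by counting pairwise deltas: matching beacons
// all produce the same difference, non-matching pairs scatter.
data class P3(val x: Int, val y: Int, val z: Int) {
    operator fun minus(o: P3) = P3(x - o.x, y - o.y, z - o.z)
}

fun findOffset(base: List<P3>, other: List<P3>, minBeacons: Int): P3? =
    base.flatMap { b -> other.map { o -> b - o } }
        .groupingBy { it }
        .eachCount()
        .entries
        .firstOrNull { it.value >= minBeacons }
        ?.key
```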
j
Not fastest solution but hope it's easy to read. https://github.com/Jintin/AdventCode2021/blob/main/src/Day19.kt
k
Question: How are there only 24 orientations? x can rotate 0, 90, 180, 270; y can rotate 0, 90, 180, 270; z can rotate 0, 90, 180, 270; x can be flipped; y can be flipped; z can be flipped. So isn't that 4 * 4 * 4 * 2 * 2 * 2 = 512 different orientations?
p
There are only 24 unique rotations of a cube - many paths to reach those rotations though
✅ 1
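One way to convince yourself of the 24 (a sketch, not anyone's solution code): take two 90° generator rotations and close over them; the orbit of the coordinate basis has exactly 24 members, because every composite path lands on one of those 24.

```kotlin
// BFS over the group generated by two 90-degree rotations; the closure
// is exactly the 24 orientations of a cube.
fun rotX(v: Triple<Int, Int, Int>) = Triple(v.first, -v.third, v.second)
fun rotY(v: Triple<Int, Int, Int>) = Triple(v.third, v.second, -v.first)

fun orientationCount(): Int {
    val basis = listOf(Triple(1, 0, 0), Triple(0, 1, 0), Triple(0, 0, 1))
    val seen = mutableSetOf(basis)
    val queue = ArrayDeque(listOf(basis))
    while (queue.isNotEmpty()) {
        val b = queue.removeFirst()
        for (next in listOf(b.map(::rotX), b.map(::rotY))) {
            if (seen.add(next)) queue.add(next)
        }
    }
    return seen.size
}
```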
d
Finally finished mine. Updated solution above. 254 ms runtime. Have not bothered to refactor code.
I manually created the rotation transforms
```kotlin
typealias Extractor = (Point) -> Int
val X: Extractor = { it.x }
val NX: Extractor = { -it.x }
val Y: Extractor = { it.y }
val NY: Extractor = { -it.y }
val Z: Extractor = { it.z }
val NZ: Extractor = { -it.z }

class Transform(val x: Extractor, val y: Extractor, val z: Extractor): (Point) -> Point {
    override operator fun invoke(p: Point) = Point(x(p), y(p), z(p))
}

val transforms = listOf(
    // X -> X
    Transform(X, Y, Z), Transform(X, NY, NZ), Transform(X, NZ, Y), Transform(X, Z, NY),
    Transform(NX, Y, NZ), Transform(NX, NY, Z), Transform(NX, NZ, NY), Transform(NX, Z, Y),
    // Y -> X
    Transform(Y, NX, Z), Transform(Y, X, NZ), Transform(Y, Z, X), Transform(Y, NZ, NX),
    Transform(NY, X, Z), Transform(NY, NX, NZ), Transform(NY, NZ, X), Transform(NY, Z, NX),
    // Z -> X
    Transform(Z, X, Y), Transform(Z, NX, NY), Transform(Z, Y, NX), Transform(Z, NY, X),
    Transform(NZ, NX, Y), Transform(NZ, X, NY), Transform(NZ, NY, NX), Transform(NZ, Y, X),
)
```
while staring at “For a 3×3 Matrix” https://www.mathsisfun.com/algebra/matrix-determinant.html lol
p
Finally finished tidying it up - https://github.com/tKe/aoc-21/blob/main/kotlin/src/main/kotlin/year2021/Day19.kt - happy with the ~50ms runtime for part 2 compared to my original 9s solution (which I'm annoyed I didn't commit for comparison)
t
Forgot to post this yesterday. I went down a tricky path with this where I was trying to find pairs of sectors that overlapped and then recursively remapping all the spots to get back to the base representation. Instead, I ended up just picking one sector as the base and using that to find overlaps, which seemed to work. Might strictly be more work than needed, but the computer is doing it so what do I care? Basically - pick a base sector, find a sector that it overlaps with, convert all its spots to the base representation, add them to the base sector, and stop looking at the old one. Repeat until everything is in the base set. • Code • Blog
šŸ‘ 2
n
Not sure if other people posted this, but it was not necessary to manually construct, or walk through, all the different rotations
The distances between pairs are invariant under both translation and rotation, so basically you can create maps from distance to lists of pairs of points, for the set of points you're interested in
if 12 beacons are identical, then you'll have at least 66 matches. Then from there, you can always find a matching distance that just has one pair in its list, and further, pick them such that the displacement vectors of the two are the same modulo permutation and sign (rotations, basically)
In other words, it's possible to more or less immediately determine that sensors 1 and 2 have 12 beacons in common, and also immediately find a pair that you know represents the same two beacons in both.
and then from that you can back out the rotation and the translation. The only catch is you still have to take a determinant, to see if the points are flipped in one case relative to the other
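The distance-fingerprint approach above can be sketched like this (illustrative names, not the actual code): squared Euclidean distance is invariant under rotation and translation, so two scanners sharing 12 beacons share at least C(12, 2) = 66 pairwise distances.

```kotlin
// Rotation/translation-invariant fingerprint of a scan: a map from
// squared pair distance to the pairs that produce it.
data class B3(val x: Int, val y: Int, val z: Int)

fun distSq(a: B3, b: B3): Int {
    val dx = a.x - b.x; val dy = a.y - b.y; val dz = a.z - b.z
    return dx * dx + dy * dy + dz * dz
}

fun pairsByDistance(scan: List<B3>): Map<Int, List<Pair<B3, B3>>> =
    buildList {
        for (i in scan.indices) for (j in i + 1 until scan.size) {
            add(distSq(scan[i], scan[j]) to (scan[i] to scan[j]))
        }
    }.groupBy({ it.first }, { it.second })

// Two scans are overlap candidates if they share >= C(12, 2) = 66 distances.
fun mightOverlap(a: List<B3>, b: List<B3>): Boolean {
    val keys = pairsByDistance(b).keys
    return pairsByDistance(a).keys.count { it in keys } >= 66
}
```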
e
oh I didn't use the distances, I used the actual deltas between observation pairs; I stored a map of deltas to [sensor, rotation that produces aforementioned deltas]. then as soon as I find a good set of deltas, I already know which rotation to use for that sensor
n
Right, but then you have to iterate over all the rotations, and store the delta for all the rotations
e
I did think about using manhattan distance, but I felt like it would have too many false positives (didn't try it out). and for some reason I did not think about using (squared) euclidean distance, until this conversation just now…
n
I still wonder that no-one (incl. me) seems to use the "a sensor sees all beacons within 1000 units" constraint from the puzzle. Normally, all these hints need to be used for a "good" solution.
n
not manhattan distance
I actually didn't even consider manhattan distance because manhattan distance is a weird thing and it's not generally conserved under rotations, just under specific ones which happen to be the ones here 🙂
euclidean distance, which of course is conserved under all rotations, which in my biased view makes it the most interesting starting point
technically, you're right, you could have false positives, but once you start thinking about how the algo works, it's basically only going to happen on data specifically engineered to confound the algo; the odds of it ever happening on real data are nearly 0.
I think what's neat about starting with euclidean distance is that with some mild modifications you could get it to do a decent job putting data back together in the generalized case of arbitrary rotations
and also, yeah, what I said before about avoiding iterating over all rotations 🙂
e
at least with how my solution is structured, the part where I search for the right offset takes way longer, I'm pretty sure I'm saving time by doing that for only one rotation
p
I took the approach of capturing the "signature" (set of absolute deltas on each axis) of each pair of a detected beacon constellation, and matching them up between scanner results - when a pair matched, I iterated over all rotations and possible deltas between the two matching pairs, and confirmed the selection with the full set - avoided rotating the full set of beacons if there wasn't at least one pair that might work
same 2
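The pair "signature" described above can be sketched like this (illustrative names): the sorted absolute per-axis deltas of a beacon pair are unchanged by any of the 24 axis-aligned rotations, so signatures can be matched across scanners before doing any rotation work.

```kotlin
import kotlin.math.abs

// Rotation-invariant signature of a beacon pair: the 24 orientations
// only permute axes and flip signs, so sorted absolute deltas survive.
data class Beacon(val x: Int, val y: Int, val z: Int)

fun signature(a: Beacon, b: Beacon): List<Int> =
    listOf(abs(a.x - b.x), abs(a.y - b.y), abs(a.z - b.z)).sorted()
```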
e
I was wondering whether there is a faster way to do the position matching - it seems vaguely convolution-like so maybe a FFT? - but didn't spend time thinking that through
p
My first approach was to just brute force all rotations on the target constellation, and try all possible shifts until at least 12 beacon locations matched - it was a little slow but got the answer
k
i usually do whatever approach is easier for my mind to comprehend without caring about whats most optimal. ill only optimize if i need to.
n
@ephemient what do you mean by "search" for the right offset?
once you find a matching pair of points, there's no more searching to be done; the rotation can be computed directly by looking at the displacement vectors, and then once you have the displacement vector you can go back to the point and compute the offset directly (by rotating and then subtracting)
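Sketch of "computing the rotation directly" (illustrative, assuming the matched displacement has three distinct nonzero absolute components): match components by absolute value to read off the axis permutation and signs, with no search over the 24 rotations. The offset then follows by rotating one matched point and subtracting.

```kotlin
import kotlin.math.abs

// Given the same beacon-pair displacement seen in two frames, recover
// the rotation as (source axis index, sign) per output component.
// Returns null when the components are ambiguous.
fun solveRotation(target: List<Int>, source: List<Int>): List<Pair<Int, Int>>? {
    if (target.map { abs(it) }.toSet().size != 3 || target.contains(0)) return null
    return target.map { c ->
        val j = source.indexOfFirst { abs(it) == abs(c) }
        if (j < 0) return null
        j to if (source[j] == c) 1 else -1
    }
}

fun applyRotation(rot: List<Pair<Int, Int>>, v: List<Int>): List<Int> =
    rot.map { (j, s) -> s * v[j] }
```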
k