FLL Resources¶
- Mr. Schaefer's Google sheet
- Veracross course page
- MS FLL site
- FLL Unearthed site
- Tech Warriors
- PrimeLessons
- Build books
- Block code for driving the robot (outdated)
- Python code for driving the robot: driving straight, turning, arm movements, line finding.
Robot designs:
- Very simple but effective base design (but an older robot)
- Compact and well designed, with a lift attachment; the mounting accommodates other attachments
- Search for "FLL masterclass" on YouTube
- CMU Spike Prime resources
General Lessons¶
Driving is key.
- The only way to get a lot of missions done is to do several in one run, using a single attachment.
- That only works if the driving is precise. See notes on driving below.
Separate driving from executing mission.
- Mark where robot is supposed to stand with tape on the table
- Get the mission going without driving
- Separately add code for driving to the mark
Robustness
- Add some wiggle to the robot's starting position and execute the run a few times.
Start the robot from a fixed position for all missions. Align using a wall.
Coding tricks using blocks¶
- Functions cannot return values, but one can simulate return values and avoid side effects by always storing output values in `vOut`. The flow is then:
  - Call a function that computes a value.
  - Copy `vOut` into a variable with a name that makes sense.
- Bool inputs are awkward to create when calling a function. One has to input a comparison, such as `0<1`, to get a `true`. That makes the code hard to read.
  - For user-facing functions, it is generally easier to use string inputs ("left", "up", etc.).
Software (Python)¶
Python is much easier to write and organize than blocks. It also has more powerful functions, such as moving motors to relative positions.
In the Spike app, the python editor is a very stripped-down version of VS Code. There is no documentation, but some VS Code keyboard shortcuts work. Code navigation is poor. Best to use find.
Useful: the app provides hover hints for Spike functions. Not sure that works in VS Code.
Organizing code into files is possible, but very tedious. See prime lessons.
Editing in VS Code¶
It is possible to write the code in VS Code (but not Windsurf) with the Lego Spike extension. Somewhat tedious: One has to manually upload changes to the hub before each run.
Workflow:
- Edit as usual.
- Upload to a slot on the hub (using the command panel).
- Run on the hub using either command panel or the "run" button in the extension menu bar.
- There is a way of placing a comment into the file to auto-run on upload.
Unresolved¶
- Can one run different missions using the same code?
- Since there is a lot of shared code, one has to hand-edit the code before uploading to each slot for each mission. Is that avoidable?
- One option would be to detect the slot while running the Python program. But that does not seem possible.
- Error messages are usually not displayed in the terminal; typically only a serialization error appears. That basically means debugging has to be done in the Spike app.
Documentation and Tutorials¶
There is no complete documentation of the API. For Python:
- Knowledge base as a website
- Some Python code for driving (not very well written, IMHO), useful for inspiration.
Precise Driving¶
This is key! Get that right first.
Resources:
- Medium
Principles¶
Keep track of the robot position¶
Think of the table as an (x,y) plane. The lower left corner is point (x=0, y=0). Moving right increases x. Moving up increases y.
Set up the robot in the start position. Record its (x,y) position in variables (xPos,yPos). Each time the robot moves, update the (xPos,yPos) variables. That way, you always know where the robot is on the table. Driving to any (x0,y0) on the table, such as the next mission, then simply means driving along a vector (x0-xPos, y0-yPos). Write a function for that.
Reset to known position¶
Over time, the robot position gets less and less accurate. Use the black-and-white lines on the table and the walls to recalibrate (xPos,yPos) whenever possible.
Don't touch the yaw angle¶
Always set the gyro yaw angle so that `0` means the robot points "north" along the y-axis.
A lot of code constantly resets the yaw angle to zero. That makes it very hard to keep track of which way the robot is pointing at any point in time. Set the yaw angle to 0 at the start and then only reset it when the robot is in a known position.
Driving functions¶
Main user-facing functions:¶
- `DriveToXy`: Drive from the current position to a given (x,y).
  - This is the main function used for driving.
  - Turns the robot into the direction of point (x,y), computes the distance, and drives straight using the gyro.
  - Currently does not handle driving backwards. To do that, turn the robot in the right direction using `TurnToAngle` and then use `GyroDistance` to drive in a straight line.
- `GyroDistance`: Drive in the direction of the current yaw angle for a given distance.
  - Requires that the conversion from wheel rotations to distance is set correctly (see the `Calibrate rotations` function). For medium wheels, the conversion is 17.5 cm per rotation.

Helper functions:
- `GyroDistance`: Drives straight in the current direction using the gyro.
  - The function can handle driving backwards (set `speed < 0`).
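The distance-to-rotations conversion is a simple ratio; a sketch (the function names are made up, only the 17.5 cm-per-rotation figure for medium wheels is from these notes):

```python
# Medium-wheel conversion from these notes: one wheel rotation ~ 17.5 cm.
CM_PER_ROTATION = 17.5

def distance_to_rotations(distance_cm):
    """Wheel rotations needed to cover distance_cm.
    Negative distances (driving backwards) give negative rotations."""
    return distance_cm / CM_PER_ROTATION

def rotations_to_degrees(rotations):
    """Motor degrees for 'run for degrees' style commands."""
    return rotations * 360.0
```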
Functions for turning:¶
- `TurnToAngle`: Turns until the yaw (converted into 0-360 degrees) equals a given angle.
  - Decides automatically whether turning left or right is shorter.
- `TurnRight`: Turns right to a given angle, even if a left turn is shorter.
- `TurnLeft`: Same, turning left.

Helper functions:
- `DxDyToAngle`: Converts a vector, described by `(dx, dy)`, into an angle (0 to 360 degrees).
  - The math uses `atan` for the conversion from the slope `dy/dx` to an angle. `atan2` would simplify the math, but is not available.
- `AngleToDxDy`: Does the reverse calculation.
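A minimal sketch of the two helpers above in plain Python, under the convention used in these notes (yaw 0 = "north" along the y-axis; the clockwise-positive compass orientation is an assumption). It uses `atan` with explicit quadrant handling, as described:

```python
import math

def dxdy_to_angle(dx, dy):
    """Convert a table vector (dx, dy) into a compass-style heading:
    0 = 'north' (+y), 90 = 'east' (+x), range [0, 360).
    Uses atan plus quadrant handling; atan2 would do this in one call."""
    if dx == 0:
        return 0.0 if dy >= 0 else 180.0
    # Angle of the slope relative to the x-axis, in [0, 90)
    a = math.degrees(math.atan(abs(dy) / abs(dx)))
    if dx > 0 and dy >= 0:      # north-east quadrant
        return (90.0 - a) % 360.0
    elif dx > 0:                # south-east quadrant
        return 90.0 + a
    elif dy < 0:                # south-west quadrant
        return 270.0 - a
    else:                       # north-west quadrant
        return (270.0 + a) % 360.0

def angle_to_dxdy(angle, dist):
    """Reverse calculation: heading and distance back into (dx, dy)."""
    rad = math.radians(angle)
    return (dist * math.sin(rad), dist * math.cos(rad))
```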
Functions for line detection¶
- `LineFinder`: Finds either a white or a black line, using either the left or the right sensor.
- `LineAlign`: Once a line has been found, rotates the robot until the other sensor also sees the line.
Driving straight¶
Use the gyroscope. The internet has code that uses the gyro to drive in a straight line; without the gyro, the robot wobbles. See prime lessons.
Tricks:
- A good strategy: Use the built-in turn function as a first pass at high speed. Then use the gyro to make the turn more precise at low speed.
How to drive a fixed distance:
- Without gyro: Set the mapping from motor rotations to distance using the built-in block. Then just use the "move a distance" block.
- With gyro: Not clear how to run a loop up to a certain number of motor rotations. Need to calibrate robot to translate time and speed into distance.
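One way to sketch the gyro-based straight-driving correction is as a proportional controller. This is an assumed approach, not code from these notes; `kp` and the sign convention would need tuning on the real robot:

```python
def steering_correction(target_yaw, current_yaw, kp=2.0):
    """Proportional steering toward target_yaw.
    Angles in 0-360 degrees; returns a signed correction,
    positive = steer right. kp is a tuning constant."""
    # Wrap the error into (-180, 180] so the robot corrects
    # via the short way even across the 0/360 boundary.
    error = (target_yaw - current_yaw + 180.0) % 360.0 - 180.0
    return kp * error
```

In a driving loop, this correction would be fed into the steering parameter of the motor-pair command each iteration until the target distance is reached.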
To do¶
- Make a test track, so that driving code can be tested when changed.
- Can one accelerate / decelerate smoothly while driving a precise distance? Less wheel slippage.
Turning¶
The gyro yaw angle ranges from -180 to +180. That makes the math awkward. For example, turning from +170 degrees to -170 degrees is a 20 degree right turn. It is best to always work with 0 to 360 degree angles instead. It simplifies the math.
Note that the gyro has "drift." It may take some time for it to settle and produce a precise reading. Should wrap reading gyro into a function that repeats reading until it settles.
Note that the robot tends to overshoot when it turns because it takes a bit of time to stop.
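The -180..+180 to 0..360 conversion and the shortest-turn decision described above can be sketched as:

```python
def yaw_to_360(yaw):
    """Map the gyro's -180..+180 reading into 0..360."""
    return yaw % 360.0

def shortest_turn(current, target):
    """Signed shortest turn from current to target heading (0-360 inputs).
    Positive = turn right, negative = turn left; result in (-180, 180]."""
    delta = (target - current) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta
```

The example from the text works out: a gyro reading of -170 maps to 190, and the turn from 170 to 190 is +20, i.e. a 20 degree right turn.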
Line detection and following¶
Sample code from Lego Ed. Prime lessons
Can either look for color (black or white) or look for a change in the reflectivity of the surface. Reflectivity seems more reliable. Since most of the table is light-colored, it is usually best to look for black lines.
Drive into the vicinity of the line before starting the line finder. That way, the chance that the light sensor gets confused by other colors along the way is reduced.
With two light sensors: Start with, say, the right sensor seeing black and the left seeing white. When the right sensor switches to white, the robot has drifted left and needs to steer right.
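The two-sensor decision rule above can be sketched as a pure function (the names and return values are illustrative, not the Spike API):

```python
BLACK, WHITE = "black", "white"

def follow_edge_steer(left_color, right_color):
    """Steering decision for two color sensors straddling a line edge.
    Convention from the notes: right sensor on black, left on white.
    Returns 'right', 'left', or 'straight'."""
    if right_color == WHITE and left_color == WHITE:
        return "right"   # drifted left off the line: steer back right
    if right_color == BLACK and left_color == BLACK:
        return "left"    # drifted right: both sensors over the line
    return "straight"    # right black, left white: on track
```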
Tasks:
- Follow to end of black and white line - how?
Driving arcs (not implemented)¶
Instead of driving, stopping, turning, and driving again, why not drive in arcs?
Function: drive linear distance X in direction Y (degrees). End up with yaw of Z degrees.
Need: Convert (X,Y,Z) into circle radius and distance to drive.
Function: drive along a circle with radius R for distance D
Approach:
- Inner wheels drive a circle with radius R-W.
- Outer wheels drive a circle with radius R+W, where 2W is the distance between the wheels.
- The relative circumference of the two circles determines the relative speed of the two wheels (rotations per second).
Use the gyro to make arcs precise. Need yaw as a function of distance driven.
All of the math can be worked out as function of radius and fraction of circle driven (terminal angle)
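Under these assumptions (R measured to the robot center, W half the distance between the wheels), the arc math sketches out as:

```python
import math

def arc_wheel_speeds(radius_cm, half_track_cm, outer_speed):
    """Wheel speeds for driving an arc of given center radius R.
    The inner wheel traces radius R-W, the outer wheel R+W; speeds must
    be proportional to the path lengths:
        inner / outer = (R - W) / (R + W)
    Returns (inner_speed, outer_speed)."""
    ratio = (radius_cm - half_track_cm) / (radius_cm + half_track_cm)
    return outer_speed * ratio, outer_speed

def arc_distance(radius_cm, terminal_angle_deg):
    """Center-path length for sweeping terminal_angle_deg along the arc."""
    return 2 * math.pi * radius_cm * terminal_angle_deg / 360.0
```

As the text notes, everything else (the yaw as a function of distance driven) follows from the radius and the fraction of the circle covered so far.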
Robot Startup¶
Before turning it on, let the robot sit for a few seconds on a perfectly horizontal surface. Otherwise, the gyro does not initialize correctly.
After booting, set yaw to 0. Then never touch it again, until it can be set based on a known position (against a wall or line).
Robot Design¶
The Coop bot is not a bad starting point. It is compact, can handle two attachments and light sensors, and is well balanced.
The Coop bot's light sensors are partly obstructed and therefore don't work properly. The area around the light sensors needs to be redesigned. It also needs wire management.
The robot should have a completely flat back side, so it can be aligned easily against a wall.
Make sure weight is distributed so the robot does not tip when starting or stopping. That's a problem with the Advanced Driving Base.
Attachments¶
Mounting efficiency is key. Build a complete box around the bot, so that there are attachment points at the top of the bot. Simply drop the attachment on and let gravity hold it.
An easy attachment: arms. Have motors pointing sideways at the front of the bot. Simply stick the arms directly into the rotating pieces (probably need to attach one piece to hold an axle for that?).
Avoid many gears. They introduce slack.
Forklift attachment: It can press levers, lift stuff, push stuff, and scoop stuff (capture and drag using a wide attachment). Highly versatile.
Moving wall attachment¶
Can move horizontally and vertically, so it is highly versatile. Could likely complete nearly all 2025 missions. There is also a more compact version. There are no build instructions, but one can follow along based on the video.
Main downside: The attachment cannot be exchanged. It is an integral part of the robot.
Moving attachments¶
During a run, arm positions are fully repeatable.
Motor positions are absolute. They do not get set to 0 when the robot starts.
During robot startup (before attachments go on), move all attachment motors to 0.
Move the attachments by moving the motor to fixed positions. This makes the movements repeatable. The attachments can be moved to a known position, regardless of their current position after a task.
On the current robot (coop):
- start with arm down and motor in position 0 (motor D)
- counter-clockwise is up
- 270 is about as far up as it goes
- angles close to 0 tend to roll over to 359. Round those angles to 0.
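The rollover rounding in the last bullet can be sketched as follows (the size of the tolerance band is a guess, not a measured value):

```python
def normalize_arm_angle(raw_degrees, rollover_band=5):
    """Treat absolute-position readings just below 360 (e.g. 359) as 0.
    The arm starts at 0 and moves counter-clockwise up toward 270, so a
    reading inside the rollover band near 360 really means 'at the bottom'."""
    if raw_degrees >= 360 - rollover_band:
        return 0
    return raw_degrees
```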
Archaeology Ideas¶
Sifting¶
Claude research report - ChatGPT deep research report
Dr. Shebalyn points out two problems with sifting:
- Requires water to sift fine material.
- Requires power. Therefore often done manually.
Pros of wet sifting:
- Separates items that stuck together. Fewer artifacts are missed.
- Cleans artifacts. Easier to see.
- Makes lumpy soil siftable.
Cons of wet sifting:
- Water is often not available. And one needs lots of it. Even recycling water cuts water use only by about half.
- Takes more time. Samples usually need to be dry-sifted first. Then they get lightly soaked in buckets. Only after they have been sitting for some time, can they be sifted again. Finally, samples need to get dried.
- Some artifacts cannot get wet.
Possible benefits of air sifting (our proposal):
- Can separate clumped items to some extent, because the material is vigorously moved around.
- Not labor intensive. Can automatically sift by running sieves through the agitated material.
Limitations:
- Fragile artifacts could be damaged. To some extent also true for conventional dry sifting.
- Does not work on all soils; e.g., clay.
Other Ideas¶
Samples are fragile. Freeing them from the surrounding material takes a lot of time (manual labor). Could that task be partially automated?
Reassembling fragments: could that be done with 3d scans and software? Probably not a new idea. What may be newer: make it into a computer game.
Related: Some residues are just tiny fragments or even stains. How to tell those apart from the surrounding dirt?
How to investigate a site without disturbing it?
Finding sites: Researchers have converted scanning of aerial images for buried sites into computer games. So far, this has only been done for photographic images, which don't work in rainforests. Applying the same idea to LiDAR would work in rainforests. But the innovation is pretty small.
- Or crowdsourcing.
- Someone mentioned the idea of a robot with ground-penetrating radar. That exists!
Preparing fossils is very time consuming. Why not train a model to recognize the parts that are certainly not fossil and let a robot remove those? Leave the tricky details to humans (for now).
Related idea: Robotic surgery enhances human precision by running movements (which have to be tiny) through actuators that make the movements smaller (and could build in safeguards against cutting the wrong parts). Why not apply that to archaeology?
Mechanical sifter - why does it not exist? Power access? Make a chain of sifters. Exists for water screening.
- The idea of a dry liquid.
Recycle the water used in water sifters. Solar energy.
Taking photos with uniform light is hard.