Explore Your Understanding

Match the Concepts

If you understood this section well, you should be able to match the concepts with confidence. This exercise will help reinforce your learning. Try to match each technique or concept with its corresponding description before clicking the 'Answer' button. Once you’ve made your matches, click 'Answer' to see how well you did!
Duration: 15-20 minutes
1. Techniques & core definitions
Techniques:
A. Transfer Learning
B. Active Learning
C. Zero-Shot Learning
D. Process-Aware Learning

Descriptions:
1. Uses task-specific knowledge to predict new, unseen categories
2. Selects the most informative unlabeled samples for annotation
3. Integrates prior biophysical or ecological models
4. Uses pretrained model weights as initialization
Answer Key:
A – 4
B – 2
C – 1
D – 3
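The pairing A – 4 (transfer learning as weight initialization) can be made concrete with a short sketch. Everything below — the weight values, the layer names, and the `init_target_model` helper — is hypothetical; the sketch only illustrates the mechanics of reusing pretrained weights while re-initializing the task-specific output head.

```python
# Minimal sketch of transfer learning as weight initialization.
# "pretrained" stands in for weights learned on a large source task;
# only the output head is re-initialized for the new (target) task.

import random

pretrained = {                      # weights from a source-domain model
    "backbone": [0.42, -0.17, 0.88],
    "head": [0.10, -0.30],          # source-task output layer
}

def init_target_model(pretrained, n_target_classes):
    """Start the target model from the pretrained backbone,
    but give it a fresh, randomly initialized output head."""
    return {
        "backbone": list(pretrained["backbone"]),   # reused as-is
        "head": [random.uniform(-0.1, 0.1) for _ in range(n_target_classes)],
    }

target = init_target_model(pretrained, n_target_classes=4)
print(target["backbone"])   # identical to the pretrained backbone
print(len(target["head"]))  # 4 — sized for the new task
```

In practice the reused backbone is then either frozen or fine-tuned at a low learning rate on the target data.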
2. Learning approaches & label strategies
Techniques:
A. Weakly Supervised Learning
B. Semi-Supervised Learning
C. Self-Supervised Learning
D. Few-Shot Learning

Descriptions:
1. Combines a small amount of labeled data with a large amount of unlabeled data
2. Uses coarse, noisy, or incomplete labels to train models
3. Learns representations using pseudo-labels or pretext tasks
4. Learns to generalize from very few examples per class
Answer Key:
A – 2
B – 1
C – 3
D – 4
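The pairing B – 1 (semi-supervised learning) can be sketched as a tiny self-training loop: a model trained on the few labeled samples assigns pseudo-labels to unlabeled samples, and the confident ones are added to the training pool. The crop names, threshold, and nearest-centroid "model" below are illustrative stand-ins for a real classifier.

```python
# Self-training sketch: small labeled set + larger unlabeled set.
# A 1-D nearest-centroid classifier stands in for a real model.

labeled = [(0.1, "maize"), (0.2, "maize"), (0.9, "wheat"), (1.0, "wheat")]
unlabeled = [0.15, 0.55, 0.95]

def centroids(data):
    """Mean feature value per class."""
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(c, x):
    """Label of the nearest centroid, plus distance as (un)certainty."""
    label = min(c, key=lambda y: abs(c[y] - x))
    return label, abs(c[label] - x)

c = centroids(labeled)
for x in unlabeled:
    label, dist = predict(c, x)
    if dist < 0.2:                      # keep only confident pseudo-labels
        labeled.append((x, label))

print(len(labeled))  # the two unambiguous samples were pseudo-labeled
```

The sample at 0.55 sits halfway between both class centroids, so it stays unlabeled — exactly the kind of ambiguous case self-training should not trust.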
3. Application contexts
Techniques:
A. Active Learning
B. Ensemble Learning
C. Process-Aware Learning
D. Zero-Shot Learning

Example use cases:
1. A system flags uncertain areas in an image for expert labeling
2. Multiple models trained on different seasons are combined
3. A model learns using crop growth simulations alongside satellite data
4. A model identifies a new crop class it wasn’t trained on
Answer Key:
A – 1
B – 2
C – 3
D – 4
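Use case A – 1 (active learning flagging uncertain areas) is commonly implemented with uncertainty sampling. The sample IDs and class probabilities below are hypothetical model outputs; the sketch only shows how the least-confident samples are selected for expert labeling.

```python
# Uncertainty sampling sketch: from a pool of unlabeled samples,
# flag the ones with the least-confident predictions for annotation.

pool = {                       # sample id -> predicted class probabilities
    "pixel_001": [0.98, 0.01, 0.01],
    "pixel_002": [0.40, 0.35, 0.25],   # model is unsure here
    "pixel_003": [0.90, 0.05, 0.05],
    "pixel_004": [0.34, 0.33, 0.33],   # and very unsure here
}

def uncertainty(probs):
    """Least-confidence score: 1 - max probability (higher = less sure)."""
    return 1.0 - max(probs)

# pick the 2 most uncertain samples to send to an expert
to_label = sorted(pool, key=lambda s: uncertainty(pool[s]), reverse=True)[:2]
print(to_label)  # ['pixel_004', 'pixel_002']
```

Other acquisition functions (margin, entropy, disagreement between ensemble members) follow the same pattern: score each unlabeled sample, then annotate the top-scoring ones.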
4. What problem does it solve?
Techniques:
A. Few-Shot Learning
B. Transfer Learning
C. Active Learning
D. Ensemble Learning

Challenges solved:
1. Learning with limited samples for rare crop types
2. Adapting models trained in one region to another
3. Reducing annotation costs by prioritizing sample selection
4. Improving model robustness and generalization
Answer Key:
A – 1
B – 2
C – 3
D – 4
5. Key components in evaluation
Concepts:
A. Spatial k-Fold CV
B. Random Sampling
C. Cross-Validation Fold
D. Model Generalisation

Descriptions:
1. Ensures validation data are geographically independent from training
2. Often results in spatial data leakage
3. A subset used for model evaluation
4. Ability to apply a model to new, unseen regions
Answer Key:
A – 1
B – 2
C – 3
D – 4
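Concept A – 1 (spatial k-fold CV) can be sketched by building folds from geographic blocks, so validation sites never share a block with training sites. The coordinates and block size below are illustrative.

```python
# Spatial k-fold CV sketch: each spatial block becomes one fold,
# so train and validation data are geographically separated.

samples = [  # (sample id, x, y) — e.g. field centroids in km
    ("s1", 1, 1), ("s2", 2, 1), ("s3", 11, 2),
    ("s4", 12, 3), ("s5", 21, 1), ("s6", 22, 2),
]

def spatial_folds(samples, block_size=10):
    """Assign each sample to a grid block; each block is one fold."""
    folds = {}
    for sid, x, y in samples:
        block = (int(x // block_size), int(y // block_size))
        folds.setdefault(block, []).append(sid)
    return list(folds.values())

for test_ids in spatial_folds(samples):
    train_ids = [sid for sid, _, _ in samples if sid not in test_ids]
    # train on train_ids, validate on test_ids — no block is shared
    print(test_ids, "vs", train_ids)
```

Libraries such as scikit-learn support the same idea via group-based splitters (e.g. `GroupKFold` with block IDs as groups).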

Short-Answer Critical Thinking

Critical thinking is essential for deep understanding. In this section, you will encounter descriptive questions that challenge your comprehension of the material. Take a moment to formulate your answers before revealing the correct responses. Click 'Answer' to check your understanding and see how many you got right!
Duration: 10-15 minutes
1. Why might self-supervised learning be particularly beneficial for agricultural imagery?
Expected answer: Because large quantities of unlabeled data are available, and self-supervised methods can extract useful representations without manual annotation.
2. Explain how process-aware learning differs from other deep learning strategies discussed.
Expected answer: It incorporates domain knowledge from physical or ecological models to guide learning, rather than relying solely on data-driven learning.
3. What risks might arise if spatial autocorrelation is not accounted for during model evaluation?
Expected answer: Models may appear to perform well on validation data that is too similar to the training data, leading to over-optimistic results that don’t generalize spatially.
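The over-optimism described in this answer can be demonstrated with a deliberately extreme synthetic example: two spatially clustered crop regions, one label each. A split that mixes regions lets every validation site borrow the answer from a near-identical training neighbour, while holding out a whole region removes that shortcut.

```python
# Why ignoring spatial autocorrelation inflates scores (synthetic demo).

import random

random.seed(0)
# two clusters of sites: "maize" near x=0, "wheat" near x=10
data = ([(random.gauss(0, 0.1), "maize") for _ in range(10)]
        + [(random.gauss(10, 0.1), "wheat") for _ in range(10)])

def nn_accuracy(train, test):
    """1-nearest-neighbour accuracy on 1-D points."""
    hits = 0
    for x, label in test:
        nearest = min(train, key=lambda p: abs(p[0] - x))
        hits += nearest[1] == label
    return hits / len(test)

# Interleaved split (stands in for a random split): every test site
# has a close same-cluster neighbour in the training set -> leakage.
acc_mixed = nn_accuracy(data[::2], data[1::2])

# Spatial split: one whole region is held out for testing.
train_region = [p for p in data if p[0] < 5]
test_region = [p for p in data if p[0] >= 5]
acc_spatial = nn_accuracy(train_region, test_region)

print(acc_mixed)    # 1.0 — optimistic, via spatial neighbours
print(acc_spatial)  # 0.0 — the model never saw the held-out region
```

Real datasets are rarely this extreme, but the direction of the bias is the same: mixed-region validation overstates how well the model will transfer to unseen areas.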