L15 - Satellite Systems in Agriculture

Gustavo Alckmin

July 25, 2025

Satellite Data for Precision Agriculture

  • Early precision farming relied on Landsat: 30 m spatial resolution, 16-day revisit, cloud interference
  • Spatial layers fed prescription maps for variable-rate fertilization, pesticide, and tillage
  • Baseline crop/soil status from surveys, field sampling, lab analysis plus point sensors
  • Critical need for frequent, cost-effective, fine-scale spatial data over heterogeneous fields
  • Emerging remote-sensing solutions: high-resolution satellites, hyperspectral, SAR
    • In WA: e.g., Sentinel-2 for reliable multispectral data, Sentinel-1 (SAR) for cloud penetration, and PlanetScope for high temporal resolution.
  • Goal: real-time monitoring of crop nutrition, growth, yield, and soil properties
    • In WA: Tackling challenges like nutrient variability in sandy soils, and monitoring large-scale broadacre crops across the Wheatbelt.

Agenda

  • Introduction to Remote Sensing Principles for WA Agriculture
  • Key Satellite Platforms & Data Sources for the WA Wheatbelt (Sentinel, Landsat, PlanetScope)
  • Application I: Mapping and Managing Soil
  • Application II: In-Season Crop Monitoring (NDVI, NDRE) for Wheat & Canola
  • Application III: Integrating SAR for Soil Moisture Estimation
  • Data to Decision: Workflow from Satellite Imagery to VRT
  • Future Trends: AI, Data Fusion, and UAV integration in WA

What is Remote Sensing?

  • Definition: Non-contact measurement of Earth features across visible and non-visible regions of the EM spectrum
  • Encompasses aerial, satellite, radar, and thermal instruments
  • Proximal sensing vs. remote sensing distinctions
  • Real-Time Kinematic (RTK) GPS for sub-cm geolocation
  • Rate controllers link spatial data to variable-rate applications
  • Applications in WA: e.g., mapping soil types in the Ord River Irrigation Area, monitoring canopy health (NDVI) in Margaret River vineyards, and delineating management zones in broadacre wheat fields.
graph LR
  A[Remote Sensing] --> B[Satellite Imagery]
  A --> C[Aerial Photography]
  A --> D[Radar]
  A --> E[Thermal Imaging]
  A --> F[Proximal Sensors]
  style A fill:#f9f,stroke:#333,stroke-width:1px

Electromagnetic Spectrum in Agriculture

  • Ultraviolet (100–400 nm) through microwave (1 mm–1 m) spectral range used in PA operations
  • Visible (400–700 nm) and NIR (700–1100 nm) bands for vegetation indices (e.g., NDVI, EVI)
  • SWIR (Short-Wave Infrared) bands are critical in WA for assessing soil and crop moisture content.
  • Multispectral vs. hyperspectral imaging: band count vs. spectral resolution trade-offs
  • Platforms like Sentinel-2 provide key bands (Visible, Red Edge, NIR, SWIR), while UAVs with sensors like MicaSense offer high-resolution data.
  • Non-imaging spectroradiometers capture point spectra across wide wavelength ranges
  • Thermal infrared (8–14 µm) sensors monitor canopy temperature and plant stress
  • Radar (3 MHz–110 GHz), LiDAR, ultrasonic and audio modalities for moisture, structure, topography and pest/equipment monitoring

Blackbody Radiance

Vegetation Spectra

Sentinel Convolution

Spectral Resolution Degradations

Spectral regions

Passive vs Active Remote Sensing

  • Passive sensors rely on ambient solar radiation, capturing spectral reflectance across visible bands (blue, green, red)
    • In WA, performance is limited by winter cloud cover in the southwest and seasonal bushfire smoke.
  • Require rigorous radiometric correction and precise image georectification to ensure quantitative accuracy
  • Performance constrained by clear-sky conditions, solar angle variability, and cloud cover
  • Active sensors emit their own illumination (e.g., LiDAR, RADAR), enabling data collection day/night and under cloudy skies
    • Sentinel-1 (SAR) is invaluable for mapping soil moisture and crop structure through clouds, crucial for in-season decisions in the grainbelt.
  • Sensing range and spatial resolution limited by onboard light-source power, beam divergence, and platform payload capacity
  • Promising integration with UAS for rapid revisit rates, yet research is needed to extend operational range and reduce system weight

Spectra by LAI

Radiances

Key Components of Satellite Imaging Systems

  • Cost-effective monitoring of crop and soil variability across paddock to regional scales
  • Spatial resolution: pixel ground size defining the smallest detectable feature
  • Spectral resolution: number and width of bands for discriminating vegetation and soil signatures
  • Radiometric resolution: digital levels per band for detecting subtle reflectance differences
  • Temporal resolution: revisit frequency critical for tracking growth dynamics and stress events
  • Sensor selection depends on the WA application: Sentinel-2 for paddock-scale variability, PlanetScope for daily monitoring of high-value crops, and Landsat for long-term historical analysis.

Practical Orbital Characteristics

  • Sun-Synchronous Orbit: Most Earth observation satellites (like Landsat and Sentinel) are in a sun-synchronous orbit. This means they pass over any given point on Earth at the same local solar time, which provides consistent illumination for comparing images over time.
  • Revisit Frequency vs. Swath Width: There is a trade-off between how often a satellite can image a location (revisit time) and how wide an area it can see (swath width). Wider swaths allow for more frequent revisits.
  • Satellite Constellations: Companies like Planet operate large constellations of small satellites (over 240 active). Many satellites working together achieve very high temporal resolution (e.g., daily revisits).

Resolutions in Satellite Imagery

  • Spatial resolution: defines ground sample distance (e.g., 0.3–500 m)
  • Spectral resolution: number and width of wavelength bands (e.g., multispectral, hyperspectral)
  • Radiometric resolution: quantization levels per band (e.g., 8–12 bits)
  • Temporal resolution: revisit frequency (e.g., daily to biweekly)
  • Sensor examples for WA: PlanetScope (~3-5 m), Sentinel-2 (10-20 m), Landsat 8/9 (30 m), and MODIS (250-500 m for regional views).
  • Positional Accuracy: Expect ~5-10 m geolocation error for free data (Sentinel/Landsat), suitable for paddock zones but may require ground control for precision alignment.
  • Application impacts: crop stress detection, yield forecasting, input zoning
graph LR
  A[Satellite Imagery Resolutions] --> B[Spatial Resolution]
  A --> C[Spectral Resolution]
  A --> D[Radiometric Resolution]
  A --> E[Temporal Resolution]
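The four resolutions above drive data volume directly. As a back-of-envelope sketch (all parameters here are illustrative, loosely modelled on a Sentinel-2-like sensor with 13 bands stored at 16 bits):

```python
# Back-of-envelope uncompressed data volume for a square image tile.
# Parameters (area, ground sample distance, band count, bit depth) are
# illustrative assumptions, not specs of any particular product.
def scene_bytes(area_km2: float, gsd_m: float, bands: int, bits: int) -> int:
    """Uncompressed size in bytes for a tile covering `area_km2`."""
    pixels = area_km2 * 1e6 / (gsd_m ** 2)   # ground area / pixel footprint
    return int(pixels * bands * bits / 8)

# A 100 km^2 tile at 10 m GSD, 13 bands, 16-bit storage
mb = scene_bytes(100, 10, 13, 16) / 1e6
print(f"{mb:.0f} MB uncompressed")  # -> 26 MB uncompressed
```

Halving the GSD quadruples the volume, which is why spatial resolution trades off so sharply against storage and processing cost.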

Overview of Satellite Data Sources

  • Optical imagery (e.g., Sentinel-2) provides key vegetation indices like NDVI (using 10m Red and NIR bands) and NDRE (using Red Edge bands) for crop health.
  • Synthetic Aperture Radar (e.g., Sentinel-1 C-band) provides reliable soil moisture and crop structure data, unaffected by cloud cover.
  • Historical archives (Landsat back to the 1980s) to validate and refine multi-year management zones
  • High revisit frequencies (Sentinel-2: ~3-4 d; PlanetScope: daily) enabling near-real-time crop monitoring
  • Data fusion frameworks combining optical and radar to improve spatial and temporal coverage
  • Collaborative analytics pipelines from DPIRD, CSIRO, and GRDC are key for developing and validating WA-specific models for management zone (MZ) delivery.

Intro to Planet, Sentinel, and Landsat

  • Landsat 8/9: Multispectral (30 m), Thermal IR (100 m native, resampled to 30 m); 8-day combined revisit.
  • Sentinel-2: 10 m & 20 m MSI including multiple Red Edge & SWIR bands; ~3-4 day revisit over WA.
  • PlanetScope: ~3-5 m resolution with daily revisit potential, though limited by cloud cover during the winter growing season.
  • WorldView-2: Panchromatic & 8-band MSI (Coastal, Yellow, Red Edge, NIR); 0.5–1.8 m; daily revisit (high-cost commercial).
  • Historical Milestones: Landsat-1 launch (1972); LACIE multi-date yield modeling (1974–75)

Spatial Resolution Comparison

  • SPOT 5: 10 m × 10 m, MODIS: 500 m × 500 m, Aerial: ~0.5 m × 0.5 m per pixel
  • High resolution (< 1 m) reveals fine-scale features; low resolution (> 30 m) aggregates data over larger areas
  • GIS-derived raster cell size dictates analysis granularity (1 m vs. 30 m grid; one 30 m pixel covers 900 one-metre cells)
  • Trade-offs: detail vs. data volume, acquisition cost, and processing requirements
  • Operational alignment: match resolution to implement scale (e.g., 90 m machinery swaths)
  • Balance spatial resolution with temporal frequency for dynamic field monitoring

Temporal Resolution Comparison

  • Short intervals yield high temporal resolution (e.g., weekly field scouting vs. monthly visits)
  • Satellite revisit cycles range from daily (high temporal resolution) to 16-day intervals (low)
  • High-frequency data essential for statistical process control to detect anomalies
  • Thematic detail vs. revisit frequency trade-off impacts attribute resolution
  • Inherent spatial–temporal resolution trade-offs constrain monitoring strategies
  • Integrating multi-scale datasets is critical for accurate crop stress detection

Spectral Bands Comparison

  • NDVI exploits contrast between Red (0.63–0.69 µm) and NIR (0.76–0.90 µm) reflectance
  • Spectral signature: unique reflectance patterns across bands for material identification
  • Spectral resolution defines minimum wavelength separation for distinguishing fine features
  • Spectral response characterizes sensor sensitivity and bandpass characteristics
  • Band selection influences detection of chlorophyll absorption, biomass, and water stress
  • Spectroradiometer measurements provide high-fidelity data for calibration and validation

Data Access and Licensing

  • Distinguish data ownership vs. access rights in service-level agreements
  • Evaluate licensing models (perpetual, subscription, data-as-a-service)
  • Integrate centralized on-farm database via OEM and contractor partnerships
  • Secure data transfer from onboard devices using standardized APIs (ISOXML, ADAPT)
  • Address legal and ethical considerations: usage rights and unintended interpretations
  • Recognize barriers to cloud adoption: ownership ambiguity, financial data sensitivity

Data Volume and Processing Requirements

  • On-farm storage: proprietary local databases for fertilizer, protection, and yield data; full data ownership but limited capacity
  • Remote (“cloud-ready”) solutions: manufacturer/third-party servers enabling predictive maintenance and aggregated performance insights
  • Web/cloud services: highly scalable storage and on-demand applications, rapid feature updates; underutilized due to security and ownership concerns
  • Data volume: multi-terabyte seasonal streams from sensors, GPS, yield monitors, and UAV imagery
  • Processing requirements: high-throughput data ingestion, edge-to-cloud synchronization, real-time analytics, and machine learning pipelines
  • Integration workflows: emerging standardized protocols and APIs, but trust, privacy, and provider lock-in remain barriers

Atmospheric Correction

  • Use MODTRAN-based models to estimate atmospheric transmittance in VIS-NIR bands
  • Correct top-of-atmosphere radiance to surface reflectance employing the 6S radiative transfer model
  • Incorporate aerosol optical depth data from AERONET or satellite-derived sources for aerosol scattering corrections
  • Apply water vapor and ozone absorption adjustments using temporally aligned meteorological profiles
  • Include adjacency effect correction to mitigate reflectance contamination from neighboring pixels
  • Validate corrected reflectance against in-situ spectral ground-truth measurements to maintain < 5 % RMSE
flowchart LR
  A[Raw Top-of-Atmosphere Radiance] --> B[Sensor Radiometric Calibration]
  B --> C[6S Atmospheric Model]
  C --> D[Apply Atmospheric Correction]
  D --> E[Derive Surface Reflectance]
  E --> F[Validation with In-situ Spectra]
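A full 6S/MODTRAN correction requires a radiative-transfer code and atmospheric profiles. As a hedged, much cruder illustration of the same idea, dark-object subtraction (DOS) assumes the darkest pixels in a band should be near-zero reflectance, so their residual value approximates the additive path-radiance (haze) term. All values below are toy numbers:

```python
import numpy as np

# Minimal dark-object subtraction sketch (a crude stand-in for 6S/MODTRAN).
# Assumption: the near-minimum value in each band is additive path radiance.
def dark_object_subtract(band: np.ndarray, percentile: float = 0.1) -> np.ndarray:
    haze = np.percentile(band, percentile)   # near-minimum value ~ haze offset
    return np.clip(band - haze, 0.0, None)   # subtract and clamp at zero

toa = np.array([[0.08, 0.12],
                [0.30, 0.05]])               # toy top-of-atmosphere reflectance
corrected = dark_object_subtract(toa)
```

DOS handles only the additive scattering component; multiplicative transmittance effects still require a proper radiative-transfer model.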

Radiometric Calibration

  • Reflectance(λ) = [DN(λ) – DN_dark(λ)] / [DN_white(λ) – DN_dark(λ)]
  • DN(λ): raw digital-number response per pixel at wavelength λ
  • DN_dark(λ): dark current response (sensor bias correction)
  • DN_white(λ): white reference plate response (normalization factor)
  • Subtract DN_dark to remove sensor bias and fixed-pattern noise
  • Divide by dark-corrected white signal to obtain true reflectance
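The calibration formula above translates directly to code. A minimal sketch, with illustrative DN values and an assumed calibrated white-reference panel:

```python
import numpy as np

# Direct implementation of Reflectance = (DN - DN_dark) / (DN_white - DN_dark).
# DN values are illustrative; the white reference is assumed to be a
# calibrated panel measured under the same illumination.
def calibrate_reflectance(dn, dn_dark, dn_white):
    dn, dn_dark, dn_white = map(np.asarray, (dn, dn_dark, dn_white))
    return (dn - dn_dark) / (dn_white - dn_dark)

refl = calibrate_reflectance(dn=[520, 1480], dn_dark=[40, 40], dn_white=[2440, 2440])
print(refl)  # reflectance of 0.2 and 0.6 for the two example pixels
```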

Geometric Correction & Orthorectification

  • Uses latitude & longitude on a spheroid to establish GCS
  • Applies ground control points (GCPs) for geometric correction
  • Considers GDOP to minimize GPS positional errors
  • Assigns geographic coordinates via georeferencing
  • Incorporates DEMs to orthorectify terrain-induced displacement
  • Refines map topology & clipping for accurate layer overlays
flowchart LR
  RAW[Raw Remote Sensing Image] --> GCP[Ground Control Points]
  GCP --> GeoCorr[Geometric Correction]
  GeoCorr --> Orth[Orthorectification]
  Orth --> OrthoImage[Ortho-rectified Image]
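The geometric-correction step in the flow above can be sketched as a least-squares affine fit from GCPs (the simplest model; real orthorectification additionally uses a DEM). Pixel and map coordinates below are invented for illustration:

```python
import numpy as np

# Least-squares affine transform from ground control points (GCPs).
# Coordinates are illustrative (a 10 m/pixel image in a UTM-like grid).
def fit_affine(px, geo):
    """px: (N,2) pixel coords, geo: (N,2) map coords; needs N >= 3 GCPs."""
    px, geo = np.asarray(px, float), np.asarray(geo, float)
    A = np.hstack([px, np.ones((len(px), 1))])   # design matrix [col, row, 1]
    coeffs, *_ = np.linalg.lstsq(A, geo, rcond=None)
    return coeffs                                 # (3,2); apply as [c, r, 1] @ coeffs

px = [(0, 0), (100, 0), (0, 100), (100, 100)]
geo = [(500000, 6500000), (501000, 6500000), (500000, 6499000), (501000, 6499000)]
T = fit_affine(px, geo)
center = np.array([50, 50, 1]) @ T
print(center)   # approx (500500, 6499500): the image centre in map coords
```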

Cloud Detection & Masking

  • Cloud cover obstructs optical remote sensing during key crop growth stages
  • Radar sensors (e.g., SAR) and UAS imaging penetrate or bypass clouds for continuous data acquisition
  • Spatial resolution vs. coverage trade-off: high-resolution imagery captures fine detail over limited areas; low-resolution rasters cover broad regions
  • Spectral resolution ranges from RGB to multispectral and hyperspectral sensors for advanced vegetation analysis
  • Cloud masking algorithms (e.g., Fmask, QA bands) automatically detect and exclude cloudy pixels from imagery
  • Cloud-free composites integrate masked data streams to provide uninterrupted temporal monitoring
flowchart LR
  A[Satellite Optical Data] --> B{Cloud Cover?}
  B -- Yes --> C[Radar & UAS Acquisition]
  B -- No --> D[Process Optical Imagery]
  C --> D
  D --> E["Cloud Masking (Fmask/QA Bands)"]
  E --> F[Cloud-Free Composite Generation]
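For Sentinel-2 Level-2A products, one common QA-band approach uses the Scene Classification Layer (SCL), where classes 3 (cloud shadow), 8–9 (cloud medium/high probability) and 10 (thin cirrus) are typically excluded. A minimal sketch with a toy SCL array:

```python
import numpy as np

# Pixel screening with the Sentinel-2 L2A Scene Classification Layer (SCL).
# The class codes are from the L2A product definition; the array is a toy example.
CLOUDY_SCL = {3, 8, 9, 10}   # cloud shadow, cloud medium/high prob, thin cirrus

def cloud_mask(scl: np.ndarray) -> np.ndarray:
    """True where the pixel is usable (not cloud, shadow, or cirrus)."""
    return ~np.isin(scl, list(CLOUDY_SCL))

scl = np.array([[4, 4, 8],    # vegetation, vegetation, cloud
                [9, 5, 3]])   # cloud, bare soil, cloud shadow
print(cloud_mask(scl))
# -> [[ True  True False]
#     [False  True False]]
```

Masked pixels are then filled from other acquisition dates to build the cloud-free composites mentioned above.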

Challenges in Using Satellite Imagery

  • Cloud cover and atmospheric scattering degrade spectral fidelity
  • Spatial resolution limits detection of sub-field variability
  • Limited spectral bands hinder discrimination of crop stress signatures
  • Radiometric calibration drift reduces temporal comparability
  • Infrequent revisit intervals miss rapid phenological changes
  • High data volumes demand robust processing and storage infrastructure

Detecting Crop Health with Imagery

  • Platforms: UAVs, manned aircraft, and satellites offering complementary spatial and temporal coverage
  • Sensors: High-resolution multispectral and hyperspectral imagers capturing narrow bands tied to pigments, moisture, and structural traits
  • Spectral Signatures: Unique reflectance patterns reveal nutrient deficiencies, water stress, and disease onset
  • Multi-Temporal Analysis: Sequential image acquisitions enable detection of symptom progression and early warning
  • Stress Discrimination: Fusion of spectral time series with soil and crop data differentiates overlapping stresses and disease complexes
  • Industry Trend: Declining sensor costs and democratization of data analytics accelerate adoption in precision agriculture

Disease and Pest Outbreak Detection

  • Decision support tools (warning services, DSS) vary in spatial/temporal resolution and data sources
  • On-site sensing: advanced computer vision, spectral imaging, and machine-learning techniques
  • Communication channels: SMS alerts, web portals, and mobile applications
  • Delivery modes: real-time notifications, periodic reports, and interactive dashboards
  • Current best practice: integrated visual scouting with instrumental sensors
  • Population-dynamics model integration for precise treatment timing and dosage

Inferring Soil Moisture from Spectral Data

  • Hyperspectral imaging systems capture water absorption features at narrow bands (e.g., 1.4 μm, 1.9 μm)
  • Multivariate regression models (PLSR, SVR, Random Forest) applied to reflectance spectra
  • RMSE improvements of 10–20 % over multispectral NDWI-based moisture indices
  • Calibration using in-situ dielectric soil moisture probes and k-fold cross-validation
  • Sensitivity to confounders: surface roughness, organic matter content, sensor noise
  • Local-scale accuracy high; challenges remain for large-area, multisite generalization

Limitations and Integrations

  • Regulatory and privacy constraints limit spatial data sharing
  • Proprietary formats create siloed agronomic datasets
  • Absence of interoperability protocols hinders cross-platform integration
  • ADAPT API: open-source toolkit for seamless data translation
  • Modular plug-ins support diverse agricultural standards
  • Facilitates unified workflows and accelerates technology adoption

Case Studies

Comparative analysis of sensor modalities in precision agriculture.


Crop Health Monitoring Case Study

Overview of modern remote sensing approaches for high-resolution crop health assessment and management.

  • Platforms: Satellites, manned aircraft & UAVs capturing multispectral, hyperspectral & thermal data
  • Smart PA: Real-time sensor feedback with variable-rate applicators for site-specific fertilization
  • Operation Tracking: Embedded systems log & geolocate seeding, spraying & fertilizing events
  • Harvest-Stage QA: Remote sensing for moisture, nutrient content mapping & batch segregation
  • Novel Sensing: In-season & postharvest imaging for stress, disease, pest & weed detection
  • Data-Driven: Integrating spatial variability analytics to optimize yield, quality & sustainability
flowchart LR
  S[Satellites/UAVs] --> D[Data Acquisition]
  D --> P[Image Processing & Analysis]
  P --> V[Variable-Rate Application]
  P --> H[Harvest Quality Assessment]
  P --> N[Novel Sensing Techniques]

Soil Moisture Monitoring Case Study

  • Traffic at high moisture contents causes soil compaction and puddling
  • Remote and proximal sensors effectively monitor surface moisture but miss conditions in deeper soil horizons
  • In-situ subsurface sensors (Cosh et al., 2012) capture moisture dynamics within distinct soil horizons
  • Soil horizon: layered zones with unique texture, structure, and hydraulic properties influencing water retention
  • Soil-moisture tensiometers measure matric tension to optimize irrigation scheduling and detect drainage issues
  • Integration with crop sensors enables predictive water-stress management rather than reactive responses

Acquiring Satellite Data

  • Raster satellite data from WEnR and NEO WCS: NDVI & WDVI biomass maps
  • Imagery applied to haulm-killing monitoring and crop vigor assessment
  • In-field sensor data ingestion: EM38 soil conductivity, Veris soil mapping, drone-derived NDVI & WDVI
  • Automatic API-based delivery directly into the user’s Akkerweb account
  • Advisory app integration for storing and reusing third-party recommendations
  • Data sharing with granular permissions: time-limited, read-only, or editable

Advanced Image Processing Techniques

  • Brovey Transform: linear fusion of high-res PAN and low-res MS bands (Zhang et al., 2010)
  • Dictionary-Based Atom Extraction: overcomplete sparse dictionary for patch-level detail enhancement
  • Wavelet-Based Multi-Resolution Analysis (MRA): decompose PAN into subbands, replace approximations with MS, inverse transform
  • MRA Advantages: high spatial-spectral fidelity, reduced distortion, strong SNR (Mather, 2004)
  • Trade-Offs: Brovey simplicity vs. spectral distortion; sparse methods control noise but require heavy computation
  • Pan-Sharpening: Use of panchromatic imagery to enhance the spatial resolution of multispectral data.

Introduction to Vegetation Indices

  • Several spectral indices (RVI, NDVI, AVI, MTVI) monitor crop growth
  • Indices and soil parameters are evaluated for correlation with yield
  • Highest-correlating variables combined into a multi-parameter field map
  • Geostatistical interpolation and clustering delineate management zones
  • In a low-variability field, two management zones were identified
  • Validation: 83 of 99 pixels matched high/low yield zones

Normalized Difference Vegetation Index (NDVI)

  • Computes vegetation vigor using (NIR – Red) / (NIR + Red)
  • Typical NDVI range: 0 (bare soil) to 1 (dense green canopy)
  • Strongly correlates with chlorophyll content, leaf area index, biomass
  • Pixel-level mapping reveals intra-field variability and stress zones
  • Basis for delineating management zones (MZs) for variable-rate input
  • Enables precision decisions: targeted fertilization & irrigation
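The NDVI formula above is a normalized difference, the same pattern reused by NDRE and NDWI with different band pairs. A minimal sketch with illustrative reflectance values:

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red); the same helper covers NDRE and NDWI
# by swapping band pairs. Reflectance values below are illustrative.
def norm_diff(b1, b2) -> np.ndarray:
    b1, b2 = np.asarray(b1, float), np.asarray(b2, float)
    return (b1 - b2) / (b1 + b2)

nir = np.array([0.45, 0.50, 0.10])   # dense canopy, moderate canopy, bare soil
red = np.array([0.05, 0.10, 0.08])
ndvi = norm_diff(nir, red)
print(ndvi.round(2))  # approx [0.80, 0.67, 0.11]
```

Pixel-wise application over whole rasters works unchanged, since numpy broadcasts the arithmetic across arrays.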

Enhanced Vegetation Index (EVI)

  • Builds on NDVI by integrating blue-band reflectance for atmospheric and canopy background correction
  • Formula: EVI = G * (NIR - Red) / (NIR + C1 * Red - C2 * Blue + L)
  • Reduces sensitivity to soil brightness, aerosol scattering, and leaf geometry variations
  • Optimized for broad-scale monitoring under dense canopies and high aerosol conditions
  • Provides improved dynamic range in high-biomass regions compared to NDVI
  • Widely applied in satellite platforms (MODIS, Sentinel-3) for ecosystem and crop mapping
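The EVI formula translates directly to code with the standard MODIS coefficients (G = 2.5, C1 = 6, C2 = 7.5, L = 1); the reflectance inputs below are illustrative:

```python
import numpy as np

# EVI = G * (NIR - Red) / (NIR + C1*Red - C2*Blue + L),
# using the standard MODIS coefficients as defaults.
def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    nir, red, blue = (np.asarray(b, float) for b in (nir, red, blue))
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Illustrative dense-canopy reflectances
print(round(float(evi(nir=0.45, red=0.05, blue=0.03)), 3))  # -> 0.656
```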

Normalized Difference Red Edge Index (NDRE)

  • Uses red-edge (~730 nm) and NIR (~780 nm) reflectance bands
  • Higher sensitivity than NDVI in dense or late-season canopies
  • Less saturation under high biomass, improves chlorophyll estimation
  • Calculation: NDRE = (NIR – RedEdge) / (NIR + RedEdge)
  • Example: (0.323 – 0.209) / (0.323 + 0.209) ≈ 0.214
  • Enables NDRE-driven variable-rate nitrogen applications
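Reproducing the slide's worked example in code:

```python
# NDRE = (NIR - RedEdge) / (NIR + RedEdge), matching the worked example above.
def ndre(nir: float, red_edge: float) -> float:
    return (nir - red_edge) / (nir + red_edge)

print(round(ndre(0.323, 0.209), 3))  # -> 0.214
```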

Normalized Difference Water Index (NDWI)

  • NDWI–Hyperion defined as (R1070 – R1200) / (R1070 + R1200)
  • Sensitive to water absorption peak near 1200 nm (Ustin et al. 2002)
  • Demonstrates robust correlation with LWC (R² ≈ 0.42–0.50 across conditions)
  • Outperforms single-band indices by normalizing bidirectional reflectance effects
  • Comparable to Vegetation Dry Index and NIR index in predictive accuracy
  • Readily implementable on UAV and satellite multispectral sensors

Soil-Adjusted Vegetation Index (SAVI)

  • Incorporates soil-adjustment factor L to reduce background soil signal
  • Defined as SAVI = (NIR – RED) / (NIR + RED + L) × (1 + L)
  • Typical L = 0.5 for intermediate vegetation cover, adjustable based on canopy density
  • Enhances sensitivity to green biomass in early growth stages over bare soils
  • Demonstrated stronger correlations with leaf area index (LAI) and chlorophyll content vs. NDVI
  • Integral in precision nitrogen management by refining spatial N status estimates
flowchart LR
  A[NIR Reflectance] --> D[Num: NIR - RED]
  B[Red Reflectance] --> D
  A --> E[Den: NIR + RED + L]
  B --> E
  C[Soil Factor L] --> E
  D --> F[/ Division /]
  E --> F
  F --> G["Multiply by (1 + L)"]
  G --> H[SAVI]
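The SAVI computation above, with the default L = 0.5 for intermediate canopy cover and illustrative reflectance values:

```python
import numpy as np

# SAVI = (NIR - Red) / (NIR + Red + L) * (1 + L), defaulting to L = 0.5.
def savi(nir, red, L=0.5):
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + L) * (1 + L)

# Illustrative early-season reflectances over partially covered soil
print(round(float(savi(nir=0.40, red=0.10)), 3))  # -> 0.45
```

Setting L = 0 recovers plain NDVI; larger L suppresses the soil signal more strongly for sparse canopies.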

Translating Index Values to Insights

  • ISU formula: (n₁ + n₂) / n₃ quantifies seeding uniformity
  • n₁: clusters (< 25 % of target distance); n₂: gaps (> 200 %)
  • n₃: preferred range gaps (± 20 % of target)
  • Adjustable threshold parameters align ISU with crop-specific criteria
  • Lower ISU values indicate higher spacing conformity
  • Supports seeder performance benchmarking and gap-filling strategies
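The ISU definition above, applied to a list of measured seed spacings (the spacings and 15 cm target are invented for illustration):

```python
# ISU = (n1 + n2) / n3 as defined above:
#   n1 = clusters (distance < 25% of target), n2 = gaps (> 200% of target),
#   n3 = distances within +/-20% of target. Spacings are illustrative.
def seeding_uniformity(distances, target):
    n1 = sum(d < 0.25 * target for d in distances)            # clusters
    n2 = sum(d > 2.00 * target for d in distances)            # gaps
    n3 = sum(0.8 * target <= d <= 1.2 * target for d in distances)
    return (n1 + n2) / n3

spacings_cm = [14, 15, 16, 3, 35, 15, 14, 16]   # one cluster (3), one gap (35)
print(round(seeding_uniformity(spacings_cm, target=15), 2))  # -> 0.33
```

Lower values indicate better conformity, consistent with the interpretation above.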

Calculating NDVI in Practice

  • NDVI reliably measures crop vigor; soil background effects diminish as canopy cover increases
  • Wheat NDVI values range ~0.4 at tillering to ~0.95 at full canopy
  • Satellite imagery: large-area coverage, rapid revisit, cloud-cover limitations
  • Drone/UAS imaging: flexible scheduling under clouds, high resolution, limited footprint
  • Key applications: nutrient deficiency, insect damage, water stress detection
  • Time-series NDVI for seasonal and multi-year crop monitoring

flowchart LR
  S(Satellite Imagery) --> P(Data Processing)
  P --> C[NDVI Calculation]
  C --> Z[Stress Detection & Zoning]
  C --> T[Time-Series Monitoring]

Interpreting Index Results

  • Definition and formula of I_SU: (n1 + n2) / n3
  • n1 = count of distances < 0.25× target (clusters); n2 = count > 2× target (gaps)
  • n3 = count of distances within ± 20 % of target spacing (preferred range)
  • Higher I_SU indicates poorer spatial uniformity with more clusters and gaps
  • Univariate Moran’s I: 0 = no spatial autocorrelation; ± 1 = strong autocorrelation
  • Bivariate Moran’s I evaluates spatial correlation between two variables
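Univariate Moran's I can be computed directly from a binary spatial-weights matrix. The 2×2 toy grid and rook-adjacency weights below are illustrative; a single high cell among low neighbours yields negative autocorrelation:

```python
import numpy as np

# Univariate Moran's I: I = (n / sum(W)) * (z' W z) / (z' z), where z are
# deviations from the mean. W here is binary rook adjacency on a toy 2x2 grid.
def morans_i(x, W) -> float:
    x, W = np.asarray(x, float), np.asarray(W, float)
    z = x - x.mean()
    return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

W = np.array([[0, 1, 1, 0],    # cell 0 neighbours: 1 (right), 2 (below)
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]])
x = np.array([1.0, 1.0, 1.0, 5.0])   # one outlier among similar neighbours
print(round(morans_i(x, W), 2))      # -> -0.33 (negative autocorrelation)
```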

Google Earth Engine

Google Earth Engine (GEE) is the leading cloud platform/IDE for remote sensing analysis, combining petabyte-scale public archives (Landsat, Sentinel, MODIS) with server-side processing.

Integrating Results into Farm Management

  • Centralized integration of soil, sensor, and environmental data
  • Predictive modeling for crop growth, water dynamics, and disease risk
  • Farmmaps UI visualizes field-level and intra-field variability
  • Model-derived prescriptions for fertilizers, irrigation, and crop protection
  • Seamless workflow alignment with farmer management schedules
  • Continuous feedback loop refines models based on field outcomes
graph LR
  A[Soil & Sensor Data] --> D(Farmmaps Platform)
  B[Weather & Environmental Data] --> D
  C[Yield & Imaging Data] --> D
  D --> E[Variability Visualization]
  D --> F[Prescriptive Recommendations]
  E --> G[Farm Management Actions]
  F --> G

Integrating Satellite Data with Ground Sensors

  • Fusion of geophysical sensors and satellite imagery for comprehensive soil variability mapping
  • Combining high-spatial-resolution multispectral data with high-temporal-resolution satellite revisits
  • Co-located on-the-go soil probes and stationary EMI/GPR sensors for calibration and validation
  • GNSS-enabled yield monitors and crop/soil sensors supplying real-time geo-referenced data
  • Spatio-temporal data fusion via kriging, machine learning, and data assimilation workflows
  • Multi-layered soil profile characterization: structure, moisture, nutrient distribution
flowchart LR
  SAT[Satellite Imagery]
  GND[Ground Sensors]
  FUS[Data Fusion Engine]
  MAP[High-Resolution Soil Maps]

  SAT --> FUS
  GND --> FUS
  FUS --> MAP

Data Management and Big Data Considerations

  • Economic assessment tools evaluate capital investments, land rental, practice changes, and labor costs
  • Big data in agriculture features high volume, variety, and complexity requiring specialized architectures
  • Multi-year yield and cost analyses pinpoint unprofitable field zones for targeted interventions
  • Cloud-based archiving and automated backups of raw yield-monitor outputs ensure long-term data integrity
  • Clearly labeled folder structures by year and data type facilitate retrospective analyses
  • Integration with decision support systems drives timely, data-driven management decisions
flowchart LR
  A[Data Sources: Sensors, Drones, Satellites] --> B[Cloud Ingestion & Storage]
  B --> C[Data Archival & Backup]
  C --> D[ETL & Preprocessing]
  D --> E[Analytics & Economic Assessment]
  E --> F[Decision Support & Management Actions]

Emerging Technologies in Agri Remote Sensing

  • Integration of > 175 imagery layers with weather, yield, planting/spraying records, harvest details, and field notes
  • Advances in sUAS enabling low-altitude, high-resolution multispectral and thermal data capture
  • Electromagnetic spectrum coverage spans UV, visible, NIR, thermal IR, and microwave bands for diverse crop and soil insights
  • Platform diversity: satellites, manned aircraft, sUAS, and vehicle-mounted units offering spatial resolutions from sub-meter to 10 m+
  • Nonimaging spectroradiometers record discrete spectral signatures; imaging sensors (multispectral & hyperspectral cameras) capture spatially resolved spectra
  • Sensor fusion workflows support variable temporal revisits, enhancing real-time crop monitoring and prescription accuracy

Future Directions in Satellite Data for Farming

  • Integration of high-frequency small-satellite constellations for daily/sub-daily revisit
  • Advancements in SAR-based all-weather monitoring for continuous stress detection
  • Fusion of multispectral, thermal, and radar data to distinguish water vs. nutrient limitations
  • Pixel-level time-series machine learning models for early pest and disease outbreak prediction
  • Development of calibrated spectral indices for direct quantification of nutrient deficiencies
  • Coupling satellite-derived evapotranspiration and soil moisture models with AI-driven irrigation scheduling

Q&A and Thank You

  • Questions on applying QGIS in precision agriculture workflows?
  • Discussion on integrating drone and sensor data with spatial analysis
  • Queries about customizing plugins for farm management tools
  • Feedback on the open-access QGIS learning resource structure
  • Collaboration opportunities for data sharing and community development
  • Next steps: exploring tutorials, forums, and contributing back