Raspberry Pi Pico W MPU6050: Send data to DynamoDB | ShillehTek

January 01, 2024

Video Tutorial

Watch first if you want to follow the full AWS setup and MicroPython code flow in real time.

Project Overview

Raspberry Pi Pico W + MPU6050: In this project, you publish MPU6050 acceleration data from a Raspberry Pi Pico W to AWS IoT Core over MQTT, route it through a Lambda function, and store it in an AWS DynamoDB table.

This architecture is mostly about connecting AWS services, so do not be intimidated if you are a beginner.

Project at a glance:

  • Time: 60 to 120 minutes
  • Skill level: Intermediate
  • What you will build: A secure MQTT pipeline from Pico W sensor data into DynamoDB using AWS IoT Core and Lambda

Parts List

From ShillehTek

  • No direct ShillehTek store product links were provided in the original article.

External

Note: This tutorial uses TLS certificates with AWS IoT Core. You will download certificate/key files from AWS and upload them to the Pico W filesystem for the MQTT client to connect securely.

Step-by-Step Guide

Step 1 - Review the architecture

Goal: Understand how data flows from the Pico W into DynamoDB.

What to do: Use this architecture: Pico W publishes sensor values with MQTT to AWS IoT Core. An IoT Rule routes messages to a Lambda function. The Lambda function writes items into a DynamoDB table.

Architecture diagram showing Raspberry Pi Pico W publishing MQTT messages to AWS IoT Core, routed by an IoT Rule to AWS Lambda, then stored in AWS DynamoDB
High-level pipeline: Pico W MQTT to IoT Core, IoT Rule to Lambda, Lambda to DynamoDB.

Expected result: You know which AWS services you will create and connect: DynamoDB, Lambda, IAM role/policy, IoT Core Thing/certs, and an IoT Rule.
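Concretely, every hop in this pipeline handles the same small JSON document. A sketch of the payload shape used later in this tutorial (the field names match the MicroPython publisher and the Lambda handler):

```python
import json

# Payload shape published by the Pico W and delivered to Lambda by the IoT Rule.
# The acceleration reading is sent as a string so the Lambda function can
# convert it to a Decimal, which is what DynamoDB expects for numbers.
payload = {
    "point_number": 1,      # partition key in the DynamoDB table
    "value": "0.98",        # MPU6050 x-axis acceleration, as a string
    "topic": "MPU6050/ax",  # MQTT topic the IoT Rule listens on
}

message = json.dumps(payload)
print(message)
```

Keeping the numeric value as a string on the device side avoids float-serialization surprises and pushes the type conversion into one place (the Lambda function).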

Step 2 - Create an AWS DynamoDB table

Goal: Create the DynamoDB table that will store your sensor records.

What to do:

  1. Go to aws.amazon.com and create an account (the AWS Free Tier is enough to get started).
  2. In the AWS console, search for DynamoDB and open the service.
  3. Select Create table.
  4. Set a table name (any name you like, but the Lambda code below uses MPU6050). Set a partition key (the code uses point_number) and make sure its type is Number.
  5. Create the table using default settings.
AWS console screenshot showing DynamoDB service selected from search
AWS DynamoDB create table screen showing table configuration options

Expected result: A DynamoDB table exists and is ready for Lambda to write items into it.

Step 3 - Create an AWS Lambda function to write to DynamoDB

Goal: Create a Lambda function that receives IoT Core event data and inserts it into your DynamoDB table.

What to do:

  1. In the AWS console, search Lambda and open the service.
  2. Select Create function, choose Author from scratch, and select Python for the runtime.
  3. Create the function, then scroll to lambda_function.py and replace the contents with the code below.
  4. After every change, click Deploy.
AWS console screenshot showing the Lambda service page
AWS Lambda create function screen with Author from scratch selected

Code:

import boto3
from decimal import Decimal
import logging

# Configure the logging module
logger = logging.getLogger()
logger.setLevel(logging.INFO)

dynamodb = boto3.resource('dynamodb')
table_name = 'MPU6050'  # must match the table name you created in Step 2
table = dynamodb.Table(table_name)

def lambda_handler(event, context):
    # Log the incoming event so it is visible in CloudWatch for debugging
    logger.info(event)

    ax = Decimal(event['value'])
    topic = event['topic']
    point_number = event['point_number']

    logger.info(point_number)

    # Write to DynamoDB
    table.put_item(
        Item={
            'point_number': point_number,
            'ax': ax,
            'topic': topic
        }
    )

    logger.info('Done Inputting')
AWS Lambda editor showing Python code in lambda_function.py

Expected result: Your Lambda function is created, updated with the provided Python code, and deployed.
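Before wiring up IoT Core, you can sanity-check the handler's parsing logic locally. This sketch replays the same event-to-item conversion with a hypothetical StubTable standing in for the boto3 DynamoDB table (StubTable is for local testing only and is not part of boto3):

```python
from decimal import Decimal

class StubTable:
    """Hypothetical stand-in for the boto3 DynamoDB Table, for local testing only."""
    def __init__(self):
        self.items = []

    def put_item(self, Item):
        self.items.append(Item)

def handle(event, table):
    # Same parsing as the Lambda handler above: DynamoDB does not accept
    # Python floats, so the reading travels as a string and becomes a Decimal here.
    item = {
        "point_number": event["point_number"],
        "ax": Decimal(event["value"]),
        "topic": event["topic"],
    }
    table.put_item(Item=item)
    return item

table = StubTable()
handle({"point_number": 1, "value": "0.98", "topic": "MPU6050/ax"}, table)
print(table.items[0]["ax"])  # prints 0.98
```

If this conversion works on a sample event, any remaining failures in the deployed function are almost always permissions (Step 4) or the IoT Rule wiring (Step 5).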

Step 4 - Update IAM permissions for the Lambda role

Goal: Ensure the Lambda function role can write to your DynamoDB table.

What to do:

  1. In the AWS console, search IAM and open the service.
  2. Select the role that was created for your function (example shown: MPU6050-TO-DB-role).
  3. Select Add permissions and add the AdministratorAccess policy.

Note: The original tutorial uses AdministratorAccess for simplicity. Be cautious with this role in production.

AWS IAM console screenshot showing the IAM service
AWS IAM roles list showing the role created for the Lambda function
AWS IAM add permissions screen showing AdministratorAccess policy selection

Expected result: The Lambda role has permissions that allow it to write items to DynamoDB.
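If you prefer not to grant AdministratorAccess, a scoped inline policy that only allows writing items to the table is enough for this pipeline. A sketch, assuming the table is named MPU6050 (the region and account ID below are placeholders; substitute your own):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "dynamodb:PutItem",
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/MPU6050"
    }
  ]
}
```

The Lambda function only ever calls put_item, so this single action is all the role needs for DynamoDB.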

Step 5 - Configure AWS IoT Core (policy, Thing, certificates, and rule)

Goal: Create an IoT Core Thing and rule so MQTT messages from the Pico W trigger your Lambda function.

What to do:

  1. In the AWS console, search IoT Core and open the service.
  2. Create an IoT policy under Security > Policies (customize as needed).
  3. Create a Thing under All devices > Things > Create things.
  4. Auto-generate certificates for the Thing.
  5. Attach the policy you created to the Thing.
  6. Download the 4 files shown by AWS (device certificate, public key, private key, and Root CA 2048). You will upload these to the Pico W later.
  7. Copy and save your AWS IoT Core endpoint from Settings. Keep this information private.
  8. Create an IoT Rule under Message routing > Rules.
  9. Define the rule's SQL statement to select all values on the MPU6050/ax topic: SELECT * FROM 'MPU6050/ax'.
  10. Set the rule action to invoke the Lambda function you created.
AWS IoT Core console home screen
AWS IoT Core left navigation showing Policies under Security
AWS IoT policy editor showing example policy configuration
AWS IoT Core Things page showing create Thing flow
AWS IoT create Thing wizard showing Thing details
AWS IoT create Thing step showing naming the Thing
AWS IoT certificate creation step showing auto-generated certificates
AWS IoT attach policy step showing selection of IoT policy for a Thing
AWS IoT certificate download screen showing certificate and key files
AWS IoT settings page showing the IoT Core endpoint URL
AWS IoT Core message routing rules page
AWS IoT rule creation screen showing rule name
AWS IoT rule SQL statement selecting messages from MQTT topic MPU6050/ax
AWS IoT rule action selection showing Lambda function as the target

Expected result: An IoT Thing is created with certs, you have saved the IoT Core endpoint, and an IoT Rule routes messages on MPU6050/ax to your Lambda function.

Step 6 - Upload certificates and add the MQTT library on the Pico W

Goal: Prepare the Pico W filesystem with the required AWS IoT certificate files and the MicroPython MQTT client library.

What to do:

  1. Open your MicroPython editor (example used: Thonny) and upload the 4 downloaded AWS IoT files to the Pico W.
  2. Download the MicroPython umqtt library from: https://raw.githubusercontent.com/micropython/micropython-lib/master/micropython/umqtt.simple/umqtt/simple.py
  3. Create a file named simple.py inside your Pico W lib folder, and paste the library contents into that file.
Thonny file manager showing AWS IoT certificate and key files uploaded to Raspberry Pi Pico W
MicroPython project folder on Pico W showing lib directory with simple.py MQTT library file

Expected result: Your Pico W has the AWS IoT certificate/key files and the simple.py MQTT library available to import.

Step 7 - Run the MicroPython MQTT publisher and verify DynamoDB items

Goal: Connect the Pico W to Wi-Fi and AWS IoT Core, publish MPU6050 acceleration data, and confirm it lands in DynamoDB through Lambda.

What to do: Create a new MicroPython file on the Pico W (name it as you like) and use the code pattern below. This code connects to Wi-Fi, sets up TLS using your certificate files, reads MPU6050 acceleration, and publishes JSON payloads to the MPU6050/ax topic.

Code:

import json
import machine
import network
import ssl
import time
import ubinascii

from simple import MQTTClient
from imu import MPU6050
from machine import Pin, I2C
import config

SSID = config.SSID
WIFI_PASSWORD = config.WIFI_PASSWORD
MQTT_CLIENT_ID = ubinascii.hexlify(machine.unique_id())
MQTT_CLIENT_KEY = "6912b69415aa106cb16c0d8008df840cd4e584f0a273cb66eb98e3941108eb98-private.pem.key"
MQTT_CLIENT_CERT = "6912b69415aa106cb16c0d8008df840cd4e584f0a273cb66eb98e3941108eb98-certificate.pem.crt"
MQTT_BROKER = config.IOT_CORE_ENDPOINT
MQTT_BROKER_CA = "AmazonRootCA1.pem"

i2c = I2C(0, sda=Pin(0), scl=Pin(1), freq=400000)
imu = MPU6050(i2c)

def read_pem(file):
    with open(file, "r") as input:
        text = input.read().strip()
    split_text = text.split("\n")
    base64_text = "".join(split_text[1:-1])
    return ubinascii.a2b_base64(base64_text)

def connect_internet():
    try:
        sta_if = network.WLAN(network.STA_IF)
        sta_if.active(True)
        sta_if.connect(SSID, WIFI_PASSWORD)
        # Poll for up to 10 seconds, stopping as soon as the connection is up
        for _ in range(10):
            if sta_if.isconnected():
                break
            time.sleep(1)
        if sta_if.isconnected():
            print("Connected to Wi-Fi")
        else:
            print("Wi-Fi connection timed out")
    except Exception as e:
        print('There was an issue connecting to WIFI')
        print(e)

def publish_mpu_values(x):
    ax = round(imu.accel.x, 2)
    payload = {
        "point_number": x,
        "value": str(ax),
        "topic": 'MPU6050/ax'
    }
    mqtt_client.publish('MPU6050/ax', json.dumps(payload))

connect_internet()

key = read_pem(MQTT_CLIENT_KEY)
cert = read_pem(MQTT_CLIENT_CERT)
ca = read_pem(MQTT_BROKER_CA)

mqtt_client = MQTTClient(
    MQTT_CLIENT_ID,
    MQTT_BROKER,
    keepalive=60,
    ssl=True,
    ssl_params={
        "key": key,
        "cert": cert,
        "server_hostname": MQTT_BROKER,
        "cert_reqs": ssl.CERT_REQUIRED,
        "cadata": ca,
    },
)

print("Connecting to MQTT broker")
mqtt_client.connect()
print("Done Connecting, sending Values")

for i in range(1, 101):
    print("Publishing point ", i)
    publish_mpu_values(i)
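The read_pem helper above strips the BEGIN/END lines and base64-decodes the body, producing the raw DER bytes that MicroPython's ssl_params expects instead of PEM text. The same logic in desktop CPython, using a tiny synthetic PEM body for illustration (not a real certificate):

```python
import binascii

def read_pem_text(text):
    # Drop the -----BEGIN/END----- lines, then decode the base64 body to DER bytes.
    lines = text.strip().split("\n")
    body = "".join(lines[1:-1])
    return binascii.a2b_base64(body)

# Synthetic body for illustration only; real AWS files are much longer.
pem = "-----BEGIN CERTIFICATE-----\nAAECAwQ=\n-----END CERTIFICATE-----"
print(read_pem_text(pem))  # b'\x00\x01\x02\x03\x04'
```

If the MQTT connection fails with an SSL error, printing len(key), len(cert), and len(ca) is a quick way to confirm the files were read and decoded at all.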

After running the script, go back to DynamoDB and check your table under Explore items.

AWS DynamoDB Explore items screen showing a table selected for viewing records
AWS DynamoDB table items view showing sensor data records inserted by Lambda

Expected result: You see new items in your DynamoDB table for point_number and ax values, created by Lambda from MQTT messages coming from the Pico W.

Note: If the MicroPython code runs but you do not see items, the original tutorial recommends checking Lambda logs in CloudWatch.

Conclusion

You built a full cloud ingest pipeline that sends Raspberry Pi Pico W MPU6050 acceleration data to AWS DynamoDB using MQTT to AWS IoT Core, an IoT Rule, and a Python AWS Lambda function.

Want parts for your next IoT build? Shop at ShillehTek.com. If you want help adapting this architecture to your product or use case, check out our IoT consulting services.

If you enjoyed the original creator’s content, you can also visit: https://www.youtube.com/@mmshilleh/videos and https://www.buymeacoffee.com/mmshilleh.