    My White Lab Coat Journey - I still can’t believe that I made it!

    TinyML
      udaydhama:

      I wasn’t supposed to make it this far. No one in my family had ever been a doctor. No financial backing. No emotional support. Just long nights, borrowed books, and a stubborn belief that I could get there. For years, I worked in government hospitals—pulling double shifts, skipping meals, standing for hours, learning what it truly meant to care for people. And somewhere along the way, I stopped dreaming for myself.

      Until this year. After a decade of service, I finally signed the lease on my own clinic. It’s small, but it’s mine. The walls were painted, the nameplate was hung... and then came something I didn’t expect to matter so much: the lab coat.

      I wanted something that didn’t just look like a doctor’s coat, but felt like the journey I’d walked. That’s when I found Lintex. Their lab coats are crisp, structured, modern & comfortable enough for 12-hour days. I picked one with a perfect fit and my name embroidered on the chest. The first time I wore it in my new clinic, it didn’t just feel like a coat. It felt like arrival.


      Recent Posts

      • Hi makers,

        I have chosen my project theme.

        Project title: Still Heard

        “For the ones who stayed quiet but always cared.”

        Still Heard is an interactive emotional companion built for those who’ve ever felt unheard, unseen, or silenced, especially in professional or personal spaces.

        I have looked at some options. I need to take input from the user, and there are two options: direct text from the user, or capturing emotion using sensors. I have read about the Grove GSR sensor. Can we use that one? Are there better options? (A minimal GSR reading sketch is included after this Recent Posts list.)
      • Hi @rahuljeyaraj,

        Welcome to the MakerGram community forum! There are several capable development boards available right now for this application; I'm sharing the ones we have in the MakerGram inventory.

        For small models, I recommend the XIAO ESP32-S3 Sense along with the Edge Impulse platform to create the model. Here the processing is handled by the ESP32 core itself, so inference may feel slow if the model is a bit heavy. (A rough inference-and-trigger sketch is included after this Recent Posts list.)

        The next option for small models is the Seeed Grove Vision AI V2 camera, where all the processing is done on the built-in Arm Cortex-M55 and Ethos-U55. You can read the inference results from the board over I2C or UART and then process them with an Arduino or any XIAO board.

        If the model is moderate in size, it is better to go with a Raspberry Pi and a Camera Module v2 or v3, and run the Edge Impulse models there.

        Currently, we have these boards available.
      • Hi everyone,

        I’m planning a vision-based AI project where the AI model should run directly on the embedded board (not in the cloud). Based on the inference results, I want to trigger actions like controlling a light or playing a pre-recorded audio file through a speaker.

        Could you please suggest which board or boards would be suitable for this kind of edge AI application? Also, does MakerGram stock such boards?

        Thanks in advance for your help!

      • @salmanfaris Thank you, Salman. This solution worked for me.

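For the sensor-based input option in the Still Heard post above, here is a minimal reading sketch for the Grove GSR sensor on an Arduino-compatible board. It is only a sketch under assumptions: the sensor's SIG line is wired to analog pin A0, and the sample count and delays are arbitrary smoothing choices, not values from the post.

    // Minimal Grove GSR reading loop (Arduino C++).
    // Assumption: the sensor's SIG pin is wired to analog input A0.
    const int GSR_PIN = A0;
    const int NUM_SAMPLES = 10;   // small moving average to tame the noisy raw signal

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      long sum = 0;
      for (int i = 0; i < NUM_SAMPLES; i++) {
        sum += analogRead(GSR_PIN);   // raw skin-conductance reading
        delay(5);
      }
      int gsrAverage = sum / NUM_SAMPLES;
      Serial.println(gsrAverage);     // stream the averaged value for logging or thresholding
      delay(100);
    }

What matters for an emotion-like signal is the change relative to a per-user resting baseline, so a short calibration period at startup (and a threshold on the deviation from that baseline) is usually needed; whether GSR alone is expressive enough is worth testing against the direct-text input option.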
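To make the XIAO ESP32-S3 Sense plus Edge Impulse suggestion above concrete, here is a rough sketch of the on-device inference loop with a triggered action. It assumes an Edge Impulse Arduino library has already been exported for the project; the header name, the "person" label, the 0.8 confidence threshold, and the LIGHT_PIN choice are placeholders, and filling the feature buffer from the camera is left as a TODO.

    // Run an exported Edge Impulse model on-device and trigger a GPIO action on a match.
    // "your_project_inferencing.h" stands in for the library name Edge Impulse exports.
    #include <your_project_inferencing.h>

    const int LIGHT_PIN = D1;   // assumed pin driving a relay or LED
    static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];   // to be filled from the camera

    // Callback that hands the feature buffer to the classifier.
    static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
      memcpy(out_ptr, features + offset, length * sizeof(float));
      return 0;
    }

    void setup() {
      Serial.begin(115200);
      pinMode(LIGHT_PIN, OUTPUT);
    }

    void loop() {
      // TODO: capture a frame from the onboard camera and fill `features` before classifying.
      signal_t signal;
      signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
      signal.get_data = &get_feature_data;

      ei_impulse_result_t result;
      if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
        return;   // classification failed; try again on the next pass
      }

      // Trigger the action when the class of interest is confident enough.
      for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
        Serial.printf("%s: %.2f\n", result.classification[i].label, result.classification[i].value);
        if (strcmp(result.classification[i].label, "person") == 0 &&
            result.classification[i].value > 0.8f) {
          digitalWrite(LIGHT_PIN, HIGH);   // e.g. switch a light; audio playback would go here instead
        }
      }
      delay(500);
    }

On the Grove Vision AI V2 route the structure is similar, except the model runs on the camera module itself and the Arduino/XIAO side only reads the detection results over I2C or UART and applies the same threshold-then-trigger logic.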