libcamera - Development of Open Source Embedded Linux Wildlife Camera System
Laurent Pinchart
laurent.pinchart at ideasonboard.com
Thu Jan 2 00:08:44 CET 2025
Hi Will,
On Tue, Dec 31, 2024 at 02:45:44PM +0000, w.robertson at cairnwater.com wrote:
> Hi Laurent, Jacopo and Kieran,
>
> All my best wishes to you for 2025!
>
> I put together this overview summary of CritterCam that hopefully
> explains how the engineering and economics of the situation make it
> feasible - let me know if it would be of any help to you as an example
> use case (it might be a good one because I'm not trying to sell anything
> or make a sales pitch but I am trying to take the speed and aggression
> of modern software engineering to an until-now far-too-sleepy area of
> electronics):
>
> https://bitbucket.org/WillRobertRobertson/crittercam/src/main/README.md
I went through the document. The project sounds very ambitious, I think
you will need to carefully plan intermediate steps on your way to the
end goal.
One point that you may have overlooked is the design of the camera
pipeline. Roughly speaking, to capture processed images, you can use:
- A raw camera sensor connected to an SoC with an ISP (ideally hardware,
possibly software with GPU offload at the cost of higher power
consumption).
- A raw camera sensor connected to a standalone ISP, itself connected to
an SoC without an ISP (but with a camera capture interface, typically
a CSI-2 or parallel receiver). This increases the cost and power
consumption due to the external ISP. Wake up latencies can also be
affected.
- A camera sensor that integrates an ISP (a.k.a. smart sensor or YUV
sensor) connected to an SoC without an ISP (but with a camera capture
interface, typically a CSI-2 or parallel receiver).
Most (if not all) of the camera sensors you list are raw camera sensors,
and most of the SoCs you list either don't include an ISP (SAMA7G54,
R2/G2, AM625, STM32N6, RT1170) or have an ISP that is not supported yet
by upstream kernel drivers and by libcamera (AM67A, TDA4VM, Genio 510).
It isn't clear to me how you plan to address that issue.
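To make that constraint concrete, the decision can be sketched in a few
lines of Python (purely illustrative; the architecture labels are mine,
and real feasibility also depends on kernel and libcamera driver
support, not just on which hardware blocks are present):

```python
# Sketch: which of the three pipeline architectures apply to a given
# sensor/SoC pairing. Illustrative only; actual feasibility depends on
# driver and libcamera support for the specific parts.

def viable_pipelines(sensor_has_isp, soc_has_isp, external_isp_available=False):
    """Return the list of workable architectures for a sensor/SoC pair."""
    options = []
    if not sensor_has_isp and soc_has_isp:
        options.append("raw sensor + SoC ISP")
    if not sensor_has_isp and external_isp_available:
        options.append("raw sensor + standalone ISP (extra cost/power)")
    if sensor_has_isp:
        options.append("YUV/smart sensor + plain CSI-2 receiver")
    return options

# A raw sensor on an SoC without an ISP has no option unless an
# external ISP is added:
print(viable_pipelines(sensor_has_isp=False, soc_has_isp=False))  # []
```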
Wake up latency is also something that will need to be considered very
carefully. If your power budget requires powering down the SoC and
DRAM completely, the system will need to cold-boot when
woken up, which will probably take too long. Some workarounds are
possible but may require very extensive software development efforts,
and those workarounds may not be portable across different SoCs.
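To reason about orders of magnitude, a rough trigger-to-first-frame
budget can be sketched as follows (every timing below is an
illustrative placeholder, not a measurement):

```python
# Sketch: trigger-to-first-frame budget for two power strategies.
# All timings are illustrative placeholders, not measured values.

def wakeup_time_ms(strategy):
    timings = {
        # suspend-to-RAM: SoC and DRAM stay powered (higher idle draw)
        "suspend_to_ram": 50 + 30,            # resume + sensor start
        # full power-off: bootloader + kernel + userspace + sensor start
        "cold_boot": 500 + 2000 + 1500 + 30,
    }
    return timings[strategy]

# A passing animal may be gone in well under a second:
budget_ms = 500
for s in ("suspend_to_ram", "cold_boot"):
    verdict = "ok" if wakeup_time_ms(s) <= budget_ms else "too slow"
    print(s, wakeup_time_ms(s), "ms:", verdict)
```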
One last (smaller) technical comment: you mention ACPI system states (S0
to S4), while none of the SoCs you list use ACPI.
Please see below for more comments.
> The in-depth analysis behind that is in separate files (far too long to
> include in the summary) that're in a private repository for now because
> they have email addresses, etc. (Think I'm developing a preference for
> GitHub over BitBucket...)
>
> Also translated Goedele's most recent article to English :-)
>
> https://new-homes-for-old-friends.cairnwater.com/carved-tree-cavities-for-endangered-dormice/
>
> All the best for 2025!
>
> Will
>
> On 2024-12-22 19:22, w.robertson at cairnwater.com wrote:
> > Hi Laurent, Jacopo and Kieran,
> >
> > I was trying to think of ways that I could help out. All my current
> > experience is server side Python/.Net/SQL ( www.faclair.com is one
> > project) and as a climbing arborist so I don't have any kernel
> > experience.
> >
> > I'm compiling a list of SoMs, SiPs and SBCs that should in theory work
> > with libcamera with relevant information from their technical data
> > sheets and questions in emails to their manufacturers - at the moment
> > this is in CritterCam's git repository but I could make this available
> > for other libcamera users and testers if this would be a help?
> >
> > I looked at using a μC with a MIPI CSI-2 interface.
> >
> > From a conservation perspective, the research has been very successful
> > but trying to fund it from my commercial work has been financially
> > disastrous - I'm moving out of my flat and into my van to save money in
> > the hope that enough funding to survive might become available next
> > year - if I tried to port libcamera to a μC I'd go bankrupt long before
> > the work could be completed.
> >
> > The need for CritterCam to outlast all currently available silicon -
> > keeping up with very rapidly evolving processor and sensor technology -
> > and to support multiple contributors contributing both software
> > features and add-on off-the-shelf and custom electronic modules to a
> > single open source repository also encouraged me to look at a μP and
> > embedded Linux instead of a μC.
> >
> > The only ways I can see at the moment of having any possibility of
> > getting a high quality camera working on a μC would be to work together
> > with Raspberry Pi or ST.
> >
> > A talk by Naushir Patuck from Raspberry Pi in November 2024 mentioned
> > that Raspberry Pi were considering the possibility of a μC with MIPI
> > CSI-2 support:
> >
> > https://www.digikey.com/en/blog/webinar-introduction-raspberry-pi-imaging-and-computer-vision-tools
> >
> > ST seem to have very recently unveiled the STM32N6x7 (datasheet v1
> > November 2024) and STM32N6x5 (datasheet link gives 404 error)
> > Cortex-M55 μC with MIPI CSI-2 support:
> >
> > https://www.st.com/en/microcontrollers-microprocessors/stm32n6-series.html
> >
> >> the i.MX8MP, for instance, has a Cortex M core
> >
> > The STM32MP25 also includes a Cortex M33 - so the i.MX8MP and STM32MP25
> > raise the theoretical possibility of having both a low-power μC
> > solution and a more power hungry but much more flexible μP solution
> > that both use the same silicon - avoiding the potentially costly
> > headache of CritterCam maintaining separate low power μC and high
> > flexibility μP forks - Cortex M can't support CSI-2 so would be limited
> > to simpler camera modules.
Whether or not you can control the camera hardware (CSI-2 receiver, ISP,
DMA engines, ...) from a Cortex M core depends on the integration of
those peripherals in a particular SoC, and thus varies between SoC
models. It's not an intrinsic property of Cortex M that it can't support
CSI-2.
> >> I presume you would have more luck asking on the rpi forums.
> >
> > Every time the need for a low power sleep or suspend mode has been
> > raised on the RPi forums the answer has been a very decisive "No.
> > Switch it off and reboot." - this is a massive problem that renders RPi
> > SBCs effectively useless for any power sensitive application - it led
> > me to look at silicon from ST and NXP instead. I keep hoping that
> > something might appear from RPi but nothing has.
> >
> >> Yes, to realize and tune a camera system you need knowledge of both
> > the ISP, the image sensor and the optics.
> >
> >> If a vendor instead provide tuning data pre-calculated for
> > a specific set of camera modules designed to work with their platforms
> > then yes, you would have a choice of pre-tuned cameras to pick from.
> >
> > Thinking about this - I used to work in the physics, chemistry and
> > nonlinear optics of optoelectronic materials - I don't know if this
> > could be any help or not - the thought went through my mind of
> > developing a small optical bench on which an image sensor and any
> > associated optics could be placed then red, green, blue and IR lasers
> > modulated and scanned over both its surface and possibly a known
> > reference image sensor while a computer reads data from it so that
> > precise calibration and tuning data for that particular make and model
> > of image sensor and any associated optics could be determined and open
> > sourced?
Calibration and tuning for camera modules requires a test bench, but I
don't think lasers would help. See chapter 6 of
https://datasheets.raspberrypi.com/camera/raspberry-pi-camera-guide.pdf
for an example of a tuning procedure and the corresponding tools and
environment.
> > I don't know if this would be of use or not (perhaps using
> > this data to train a simple NN could be a way of implementing
> > calibration and tuning algorithms if NPUs or GPUs become fast enough -
> > NPUs and GPUs might be good at handling the simple but highly parallel
> > tensor operations required without proprietary tie-in)?
> >
> > This might allow some flaws and limitations inherent in the
> > manufacturing of the sensor and optics to be compensated for through
> > firmware tuning and calibration?
Flaws such as defective pixels, or intrinsic physical properties of the
camera module that degrade the image quality (noise, lens shading, ...)
are typically corrected by the ISP, based on calibration and tuning
data.
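As a toy illustration of the kind of correction involved, here is a
defective-pixel fix that replaces flagged pixels with the median of
their neighbours (real ISPs do this in hardware, on the raw Bayer
data, per colour plane):

```python
from statistics import median

# Toy defective-pixel correction: replace pixels flagged in calibration
# data with the median of their in-bounds 8-neighbours.

def correct_defective_pixels(image, defects):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for (y, x) in defects:
        neighbours = [image[j][i]
                      for j in range(max(0, y - 1), min(h, y + 2))
                      for i in range(max(0, x - 1), min(w, x + 2))
                      if (j, i) != (y, x)]
        out[y][x] = median(neighbours)
    return out

img = [[10, 10, 10],
       [10, 255, 10],   # stuck-high pixel at (1, 1)
       [10, 10, 10]]
print(correct_defective_pixels(img, [(1, 1)])[1][1])  # 10
```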
> >> The implementation of
> > the algorithms, how they use and combine features of different IPs on
> > the platforms (gpu, other accelerators etc) and the tuning of the whole
> > camera pipeline is usually what SoC vendors compete on.
> >
> >> Sensor driver manufacturers are not involved when it comes to ISP
> > algorithms development. Knowledge of the ISP, its design and
> > documentation are usually only accessible to the SoC vendors (again
> > apart from RPi that fully documents their ISP publicly).
> >
> > Thank you very much - it's a big help for me to understand that - so
> > basically one way that businesses like ST and NXP compete with each
> > other is by competing to provide the best ISP algorithms as well as the
> > best silicon?
> >
> >> The RPi pipeline handler implements a selection logic to pick the
> >> "best"
> > sensor configuration given a desired output streams configuration:
> > https://git.libcamera.org/libcamera/libcamera.git/tree/src/libcamera/pipeline/rpi/common/pipeline_base.cpp#n937
> >
> >> libcamera offers an API to override the pipeline handler choice, so I
> > would say your application is free to choose any configuration
> > offered by the sensor's driver.
> >
> > Thank you very much - that's an enormous help - I'd been confused by
> > that. Typically commercial wildlife cameras boast the number of pixels
> > - often including nonexistent "interpolated pixels" - whereas the big
> > limiting factor on the image quality is the signal to noise ratio on
> > those pixels so binning may be very valuable - particularly in low
> > light conditions or when processing power and storage are limited.
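Your intuition about binning is easy to quantify: for shot-noise-limited
pixels the SNR is the square root of the signal, so summing 2x2 pixels
improves it by a factor of 2, i.e. roughly 6 dB (the figures below are
illustrative):

```python
import math

# Shot-noise-limited SNR: a signal of S electrons has noise sqrt(S),
# so SNR = sqrt(S). Binning n x n pixels sums n^2 signals, improving
# SNR by a factor of n (ignoring read noise, which makes binning even
# more attractive in low light).

def snr_db(electrons):
    return 20 * math.log10(math.sqrt(electrons))

signal = 100  # electrons per pixel, illustrative
unbinned = snr_db(signal)
binned_2x2 = snr_db(4 * signal)
print(round(binned_2x2 - unbinned, 1), "dB gain from 2x2 binning")
```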
> >
> >> Some vendors
> > decide to distribute their "advanced" algorithms as binaries through
> > different channels and support their platforms in mainline libcamera
> > with
> > a reduced number of features in the source implementation.
> >
> > Why don't they contribute their "advanced" proprietary features as
> > obfuscated and compiled binary plugins for libcamera?
They can do that, and some vendors are looking at this for their newest
SoCs. I believe ST will have a closed-source plugin (we call those IPA
modules in libcamera, for Image Processing Algorithms) for the STM32MP2
(don't quote me on that though). There will be more vendors adopting
that approach.
> >> Maybe that's why I've never seen a flying squirrel in Nuuksio :-(
> >
> > The bad news is that their numbers have been dropping very rapidly. The
> > good news is that research into Pteromys volans by Ralf Wistbacka and
> > others and into Red Squirrels, Hazel Dormice and Garden Dormice by
> > Goedele Verbeylen, Thomas Briner and others has identified lack of
> > suitable nesting holes as a significant cause for this decline and that
> > we now have very fast, precise and scalable chainsaw and rotary carving
> > techniques to create nest holes for them. The other good news is that
> > Goedele Verbeylen and her colleagues have brought Garden Dormice back
> > from the brink of extinction in Belgium, proving that it can be done.
> >
> >> If you need to
> > operate with really low power, keeping the Cortex A cores and DRAM
> > powered, even in low-power mode, may go over your power budget.
> >
> > Yup. I'm keeping a close eye on reported power consumption for SoMs and
> > I'll measure power consumption. I'm also trying to keep everything
> > small to fit into a small amount of RAM and tentatively rejecting any
> > board that uses DDR4 instead of LPDDR4.
> >
> > The power available is limited by the amount of sun, size of solar
> > panels, size of rechargeable batteries and battery chemistry (Li and
> > LiFePO4 batteries work well in summer but only PbA batteries work below
> > a few °C). It may be that tiny wind generators could be added in
> > difficult conditions. Like any tree dwelling creature, CritterCam has
> > to be very light, flexible and adaptable to be successful.
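The power budget arithmetic is simple to sketch, for what it's worth
(all figures below are placeholders to plug your measurements into):

```python
# Sketch: days of autonomy from battery capacity, average draw, and
# solar harvest. All numbers are placeholders for measured values.

def days_of_autonomy(battery_wh, avg_load_w, solar_wh_per_day=0.0):
    load_wh_per_day = avg_load_w * 24
    net = load_wh_per_day - solar_wh_per_day
    if net <= 0:
        return float("inf")  # solar covers the load indefinitely
    return battery_wh / net

# 50 Wh battery, 20 mW average draw, 0.3 Wh/day from a shaded panel:
print(round(days_of_autonomy(50, 0.020, 0.3), 1), "days")
```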
> >
> > Simply by allowing the battery, solar panel and the μP to be in
> > separate boxes, allowing videos and photos to be optionally compressed
> > and allowing WiFi or Ethernet to be switched on to download data,
> > CritterCam will remove the need for conservationists and researchers to
> > risk their lives on wobbly ladders changing batteries and SD cards.
Modular designs are always tempting, but they come at a cost. I would
advise, for the first version, focusing on the components where
modularity would bring the best added value, and keeping the other
components in a more monolithic design. An external solar panel seems to
make sense, and making the connection waterproof shouldn't be too
difficult. An external camera module, on the other hand, would be more
difficult, with (in my opinion) less added value. You could still make
the design within the main enclosure modular to let users pick among
different camera sensors, or decide what communication module(s) they
want to install.
> >
> > An energy and power meter I ordered should arrive in a week or two and
> > I'll start measuring how much power a solar panel can give in the
> > typical worst case of a forest floor.
> >
> > When things get really desperate an optional GSM module could be
> > switched on to send a simple SMS or MQTT message to ask someone to come
> > and change a low battery for a fully charged one and consider adding
> > more solar panels.
> >
> >> C-PHY is more recent, it can provide the same bandwidth with a lower
> >> number of signals, but isn't as widely supported (I expect that to
> >> improve over time).
> >
> > That's good to know - C-PHY's 3 level signalling must be giving big
> > headaches for silicon designers used to working with digital signals.
> > Good to know that for the time being D-PHY is the best supported.
> >
> >> CSI-3 was killed by Intel and Qualcomm before it reached the market
> >
> > That's good to know. One less thing to keep track of. CSI-1 and CSI-3
> > RIP.
> >
> > Will
> >
> > On 2024-12-22 10:30, Laurent Pinchart wrote:
> >> Hi Will,
> >>
> >> On Fri, Dec 20, 2024 at 11:04:09AM +0000, w.robertson at cairnwater.com
> >> wrote:
> >>> Hello Laurent and Kieran,
> >>>
> >>> > This sounds very interesting. We rarely get contacted by people working
> >>> > on projects related to animals or nature. I live in Finland, so I would
> >>> > be happy to help the flying squirrels from Nuuksio :-)
> >>>
> >>> That's wonderful! Dormouse species hibernate but the flying squirrels
> >>> can't hibernate and so must have nest holes which provide protection
> >>> from weather and allow them to build well insulated nests to survive
> >>> the
> >>> winter, they also ideally need nest holes with entrances which are
> >>> the
> >>> right size to let them in but keep the larger red squirrels out. Past
> >>> forestry practice was to remove dead trees and trees with holes
> >>> (which
> >>> were seen as unsightly or unstable) and this has left a drastic
> >>> shortage
> >>> of suitable nest holes for the flying squirrels.
> >>
> >> Maybe that's why I've never seen a flying squirrel in Nuuksio :-(
> >>
> >>> Most of my research is unpaid (with occasional small sponsorship here
> >>> and there) so resources have to be used very efficiently.
> >>>
> >>> By developing new rotary boring and precision chainsaw carving
> >>> techniques we can carve nest holes entirely through a single narrow,
> >>> very precise entrance with a much larger nesting chamber behind.
> >>>
> >>> > I'll try to help, in my (unfortunately limited) free time. The best
> >>> > option, if compatible with your needs, would be to have open
> >>> > discussions
> >>> > either on the libcamera development mailing list, or on the project's
> >>> > IRC channel (you can find contact information in
> >>> > https://libcamera.org/contributing.html). That way other developers
> >>> > will also be able to chime in.
> >>>
> >>> That would be wonderful!
> >>>
> >>> > libcamera doesn't currently support the STM32MP2, but ST is working on
> >>> > it. They showcased a working prototype at Embedded World in April this
> >>> > year, and I assume they will provide a version usable in production
> >>> > soon. While they haven't started upstreaming much of their code in
> >>> > libcamera, I believe they plan to do so. The platform is therefore
> >>> > interesting for low-power applications.
> >>>
> >>> Thank you very much - I'd been unsure about that - that's very
> >>> helpful
> >>> to know - I'll maybe try to find a way to give ST some gentle
> >>> encouragement in the spring - they're pushing aggressively to sell
> >>> both
> >>> their image sensors and STM32MP2 so libcamera support should be a
> >>> high
> >>> strategic priority for them. I get the feeling that having launched
> >>> STM32MP2 into both industrial control and domestic appliance markets
> >>> they've got a fairly heavy workload at the moment and the
> >>> STM32MP257F-DK
> >>> dev. board is due to be launched later than planned.
> >>>
> >>> > The i.MX8MP is much better supported in libcamera at the moment, and we
> >>> > are actively improving its support. There are low-cost development
> >>> > boards, such as the Debix Model A that we use for development (note that
> >>> > its camera connector is not standard, but they have a Debix I/O board
> >>> > that can interface the Model A camera connector to a Raspberry Pi 15
> >>> > pins connector).
> >>>
> >>> Thank you very much! That's wonderful to have a good dev. board! I'd
> >>> been looking at i.MX8MP SoMs and SiPs but some of them still seem to
> >>> be
> >>> in the design stage with some technical information missing from data
> >>> sheets. Will Debix Model A's LPDDR4 allow it to quickly enter and
> >>> waken
> >>> from a low power sleep mode?
> >>
> >> I haven't experimented with low power sleep and wake up times myself,
> >> so I can't tell.
> >>
> >> Generally speaking, it all depends on your power budget. If you need
> >> to
> >> operate with really low power, keeping the Cortex A cores and DRAM
> >> powered, even in low-power mode, may go over your power budget. Other
> >> options are possible, such as turning power to the Cortex A and DRAM
> >> off, and handling fast wake up tasks in a smaller core (the i.MX8MP,
> >> for
> >> instance, has a Cortex M core that could be used for this). The wake
> >> up
> >> time gets significantly increased, as Linux will need to boot from
> >> scratch. You may be able to start powering up the camera sensor in
> >> parallel from the Cortex M core to save some time, or even drive the
> >> camera completely from the Cortex M, but that would require porting
> >> libcamera (and drivers) to whatever OS you would be running there,
> >> which
> >> would require really significant effort.
> >>
> >>> The 4 lane MIPI CSI seems potentially more
> >>> flexible than the STM32MP2's 2 lane CSI-2 interface.
> >>
> >> 4 lanes will give you more bandwidth, but unless you target really
> >> high
> >> resolutions at high frame rates, that shouldn't be necessary.
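To put rough numbers on that (ignoring CSI-2 protocol overhead and
blanking; the lane rate below is illustrative, many D-PHY receivers
run somewhere around 1.5 to 2.5 Gbit/s per lane):

```python
# Sketch: does a CSI-2 link have enough bandwidth for a given sensor
# mode? Protocol overhead and blanking are ignored, and the lane rate
# is an illustrative figure, not a datasheet value.

def required_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

def link_gbps(lanes, lane_rate_gbps):
    return lanes * lane_rate_gbps

mode = required_gbps(1920, 1080, 30, 12)   # 1080p30, RAW12
print(round(mode, 2), "Gbit/s needed")
print(mode <= link_gbps(2, 1.5))  # two D-PHY lanes at 1.5 Gbit/s suffice
```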
> >>
> >>> I guess the best way for me to get a Debix Model A or B is just to
> >>> order
> >>> it from RS?
> >>
> >> I believe so.
> >>
> >>> (One thing that's confusing me - when modern data sheets like the
> >>> Debix
> >>> Model A data sheet say "MIPI CSI 4-lane" I guess they mean "MIPI
> >>> CSI-2"?
> >>> I'd heard of CSI-1 which supports only 1 lane and seems old and
> >>> rarely
> >>> used now, of CSI-2 which supports multiple lanes and theoretically a
> >>> choice of C-PHY or D-PHY physical layers (though I guess many CSI-2
> >>> implementations maybe don't implement C-PHY because of its
> >>> complexity?)
> >>> for the physical layer and CSI-3 which is based on UniPro and rarely
> >>> implemented at the moment.)
> >>
> >> Today the term "MIPI CSI" (or sometimes just "MIPI camera") refers to
> >> CSI-2. CSI-1 was retired a very long time ago, and CSI-3 was killed by
> >> Intel and Qualcomm before it reached the market (there were some
> >> prototype implementations, but nothing in mass production).
> >>
> >> D-PHY is the most common PHY for CSI-2, and should be supported
> >> everywhere. C-PHY is more recent, it can provide the same bandwidth
> >> with
> >> a lower number of signals, but isn't as widely supported (I expect
> >> that
> >> to improve over time). I don't think the choice of PHY matters too
> >> much
> >> in your case, whatever is supported by both the camera sensor and the
> >> SoC will be fine.
> >>
> >>> Thank you very much for all your help!
> >>>
> >>> Will
> >>>
> >>> On 2024-12-19 23:17, Laurent Pinchart wrote:
> >>> > Hello Will,
> >>> >
> >>> > (CC'ing my colleague Kieran)
> >>> >
> >>> > On Thu, Dec 19, 2024 at 07:46:53PM +0000, w.robertson at cairnwater.com
> >>> > wrote:
> >>> >> Hi Laurent
> >>> >>
> >>> >> Thank you very much for your video "Giving Linux a Camera Stack:
> >>> >> libcamera's 3 Years Journey and Exciting Future" and the wonderful
> >>> >> work
> >>> >> of you and the libcamera developers in bringing such a powerful and
> >>> >> structured approach to such an extremely difficult and complex
> >>> >> problem.
> >>> >
> >>> > You're more than welcome.
> >>> >
> >>> >> I work as a climbing arborist in practice and research and software
> >>> >> engineer. My research is mostly voluntary and focused on new minimally
> >>> >> invasive techniques to carve critically needed nest holes for
> >>> >> endangered
> >>> >> tree dwelling dormouse species (Hazel Dormouse, Garden Dormouse,
> >>> >> Forest
> >>> >> Dormouse), bat species and in Finland the European Flying Squirrel
> >>> >> (Pteromys volans).
> >>> >>
> >>> >> To overcome problems and limitations with commercial wildlife cameras
> >>> >> for small mammals I started work on CritterCam which - if I can get it
> >>> >> to work - will be a fully open source wildlife camera system which is
> >>> >> vastly better, vastly more flexible, vastly more independent in the
> >>> >> field and significantly cheaper than anything commercially available
> >>> >> at
> >>> >> the moment. Working with embedded Linux and with several different
> >>> >> manufacturers for the microprocessor and image sensors for CritterCam
> >>> >> means that CritterCam users can never be price gouged and CritterCam
> >>> >> will be able to adapt to update all of its critical components with
> >>> >> newer alternatives well into the future.
> >>> >>
> >>> >> This is an article with some videos of Garden Dormice exploring our
> >>> >> carved nest holes published in Flemish yesterday by Goedele Verbeylen:
> >>> >>
> >>> >> https://www.natuurpunt.be/nieuws/boomholtes-kerven-voor-bedreigde-eikelmuizen
> >>> >>
> >>> >> This is a short summary that I wrote in English and German:
> >>> >>
> >>> >> https://drive.google.com/file/d/1jkFA-PgDr-O9-nGSgODCUFkukj63F4fc/view?usp=sharing
> >>> >>
> >>> >> We have a small website at:
> >>> >>
> >>> >> new-homes-for-old-friends.cairnwater.com
> >>> >
> >>> > This sounds very interesting. We rarely get contacted by people working
> >>> > on projects related to animals or nature. I live in Finland, so I would
> >>> > be happy to help the flying squirrels from Nuuksio :-)
> >>> >
> >>> >> My background is mainly server side so I don't have kernel experience
> >>> >> and I was wondering if you or another libcamera developer might be
> >>> >> willing to give
> >>> >> me any pointers to help me get started using supported MIPI CSI-2
> >>> >> cameras with embedded Linux or Android?
> >>> >
> >>> > I'll try to help, in my (unfortunately limited) free time. The best
> >>> > option, if compatible with your needs, would be to have open
> >>> > discussions
> >>> > either on the libcamera development mailing list, or on the project's
> >>> > IRC channel (you can find contact information in
> >>> > https://libcamera.org/contributing.html). That way other developers
> >>> > will
> >>> > also be able to chime in.
> >>> >
> >>> >> I'm hoping to build a first prototype using the STM32MP257F-DK dev.
> >>> >> board based on the STM32MP257 μP when it becomes available in January
> >>> >> but I'm also looking at SoMs and SiPs based on other μPs like the NXP
> >>> >> i.MX 8M Plus that support a CSI-2 interface and a low power sleep mode
> >>> >> -
> >>> >> I can also do some initial testing using a Raspberry Pi SBC.
> >>> >
> >>> > libcamera doesn't currently support the STM32MP2, but ST is working on
> >>> > it. They showcased a working prototype at Embedded World in April this
> >>> > year, and I assume they will provide a version usable in production
> >>> > soon. While they haven't started upstreaming much of their code in
> >>> > libcamera, I believe they plan to do so. The platform is therefore
> >>> > interesting for low-power applications.
> >>> >
> >>> > The i.MX8MP is much better supported in libcamera at the moment, and we
> >>> > are actively improving its support. There are low-cost development
> >>> > boards, such as the Debix Model A that we use for development (note
> >>> > that
> >>> > its camera connector is not standard, but they have a Debix I/O board
> >>> > that can interface the Model A camera connector to a Raspberry Pi 15
> >>> > pins connector).
> >>> >
> >>> > Raspberry Pi platforms are also well supported, with the work being
> >>> > performed by Raspberry Pi.
--
Regards,
Laurent Pinchart