libcamera - Development of Open Source Embedded Linux Wildlife Camera System

Laurent Pinchart laurent.pinchart at ideasonboard.com
Tue Jan 7 14:56:46 CET 2025


Hi Will,

On Tue, Jan 07, 2025 at 12:23:36PM +0000, w.robertson at cairnwater.com wrote:
> Hi Laurent,
> 
> Thank you very much.
> 
> > Roughly speaking, AEGC (auto-exposure and gain control) involves
> > computing statistics on the image (it can be as simple as measuring the
> > average pixel intensity on a small region of interest in software, but
> > to obtain good results more complex types of statistics are needed, such
> > as dividing the image into a grid and computing averages and histograms
> > for each grid element), and using those statistics to compute the
> > exposure time and analog gain for the next frame. The process needs to
> > be repeated for every frame. Some sensors implement AEGC internally,
> > which could be useful for your use cases. Otherwise, statistics are
> > computed by the ISP, and libcamera runs an AEGC algorithm to process the
> > statistics, compute the exposure time and gain, and apply them to the
> > sensor.
> 
> > Note that the AEGC algorithm will take several frames to converge, which
> > means that the first few frames captured from the camera won't be
> > properly exposed. Convergence may be faster when AEGC is implemented by
> > the sensor (no guarantee though, it would need to be tested).
> 
> With wildlife cameras it may be possible to use tricks like a photodiode 
> or photodiode array with an ADC as a light level meter for AEGC - that 
> wouldn't work so well for other applications but for wildlife cameras it 
> might.

That's one of the things you would be able to experiment with more
easily with libcamera compared to closed-source camera stacks :-)
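
As a rough illustration of the light-meter idea (everything here is
hypothetical: the ADC width, the calibration constant, the limits), a
photodiode reading could seed the exposure before the first frame, so the
sensor starts near a correct exposure instead of converging over several
frames:

```python
# Hypothetical sketch of seeding exposure from a photodiode + ADC light
# meter, so the first frame starts close to correct exposure. All the
# constants below are invented; a real system would calibrate ADC counts
# against scene illuminance for its particular photodiode and optics.

LUX_PER_COUNT = 0.5       # invented calibration factor (lux per ADC count)
EXPOSURE_MIN_US = 100     # shortest exposure the sensor accepts (assumed)
EXPOSURE_MAX_US = 33000   # cap exposure at roughly 1/30 s (assumed)

def seed_exposure_us(adc_count, k=200000.0):
    """Map an ADC light reading to an initial exposure time.

    Exposure is inversely proportional to scene illuminance; k is a
    hypothetical calibration constant in us*lux.
    """
    lux = max(adc_count, 1) * LUX_PER_COUNT
    return min(max(k / lux, EXPOSURE_MIN_US), EXPOSURE_MAX_US)
```

In practice k would come from calibrating the photodiode against the
image sensor for a given lens and housing.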

> It could be that NPUs can do this sort of task more efficiently 
> than CPUs - it might be possible to implement AEGC as a simple neural 
> network running on an NPU - I'm not sure.

Possibly. Training an NN model for this may be a lot of work, though.
The traditional AEGC algorithm approach seems less costly to me.
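
Roughly, the traditional loop described above could be sketched as follows
(an illustrative toy, not libcamera's implementation; all constants are
invented):

```python
# Illustrative sketch of a traditional AEGC feedback loop: divide the
# frame into a grid, average each cell, and nudge exposure time and
# analog gain toward a target brightness for the next frame.

TARGET_MEAN = 0.18 * 255   # mid-grey target for 8-bit pixels (assumed)
EXPOSURE_MAX_US = 33000    # cap exposure at roughly 1/30 s (assumed)
GAIN_MAX = 8.0             # sensor analog gain limit (assumed)

def grid_means(frame, rows=4, cols=4):
    """Average pixel intensity of each grid cell (frame: list of rows)."""
    h, w = len(frame), len(frame[0])
    means = []
    for gy in range(rows):
        for gx in range(cols):
            cell = [frame[y][x]
                    for y in range(gy * h // rows, (gy + 1) * h // rows)
                    for x in range(gx * w // cols, (gx + 1) * w // cols)]
            means.append(sum(cell) / len(cell))
    return means

def aegc_step(frame, exposure_us, gain, damping=0.5):
    """One AEGC iteration: compare measured brightness to the target,
    then scale exposure time first and analog gain second."""
    means = grid_means(frame)
    measured = sum(means) / len(means)
    error = TARGET_MEAN / max(measured, 1e-3)
    # Damped multiplicative update so the loop converges over a few
    # frames instead of oscillating.
    total = exposure_us * gain * (error ** damping)
    new_exposure = min(total, EXPOSURE_MAX_US)
    new_gain = min(max(total / new_exposure, 1.0), GAIN_MAX)
    return new_exposure, new_gain
```

On a simulated static scene this loop settles close to the target within
a handful of frames, which matches the several-frame convergence
behaviour described above.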

> > It would be interesting to port libcamera to those smaller SoCs, running
> > real time operating systems, but it's a large project.
> 
> How would this be done? Would it be a matter of finding one funder or a 
> consortium of funders to fund the work or of starting with a subset of 
> the simplest functionality and progressing from there?

It first requires finding someone, or a group of people, with an
interest in making this happen. Those people could contribute time by
performing the work, or budget to outsource it.

> One of the problems might be choosing an RTOS - different players seem 
> to be favouring different RTOSs so I became cautious about investing too 
> much in any one RTOS.
> 
> What's the situation like with Rockchip SoCs? I've heard conflicting 
> things about them recently so I became cautious.

libcamera supports the RK3399, which is getting quite old now (but may
still be an interesting platform to consider). Newer Rockchip SoCs are
not supported yet. I know that some people have expressed interest in
working on them, but no code has been contributed yet. I'm not aware of
any schedule.

> On 2025-01-06 23:21, Laurent Pinchart wrote:
> > On Mon, Jan 06, 2025 at 04:41:03PM +0000, w.robertson at cairnwater.com  wrote:
> >> Hi Laurent,
> >> 
> >> > That looks fine.
> >> 
> >> That's great - that's an enormous help to know that I'm on the right
> >> track. The DEBIX Model A SBC (NXP i.MX 8M Plus) and DEBIX I/O Board
> >> arrived :-)
> >> 
> >> > I have no experience with those chips, so I can't comment on this.
> >> 
> >> I think hardly anyone has experience with them - the idea of a μC with
> >> CSI-2 and ISP is so new - the STM32N6 was only launched in December.
> >> 
> >> > I'm not sure what they expect their customers to use for the time being,
> >> > I believe (and hope) libcamera is the solution they will push forward.
> >> 
> >> Thanks - I'll see if I can gently give them some encouragement
> >> regarding libcamera.
> >> 
> >> > It's theoretically possible, assuming again that auto-exposure is
> >> > implemented on the device. That, and auto-focus if needed, are the
> >> > only parts that can't be implemented in post-processing. You would
> >> > however need to store the data uncompressed, which will require a large
> >> > amount of storage, and high bandwidth to the storage medium.
> >> 
> >> Auto-exposure is something that I don't know about - some image
> >> sensors seem to claim that they implement it and others don't.
> > 
> > Roughly speaking, AEGC (auto-exposure and gain control) involves
> > computing statistics on the image (it can be as simple as measuring the
> > average pixel intensity on a small region of interest in software, but
> > to obtain good results more complex types of statistics are needed, such
> > as dividing the image into a grid and computing averages and histograms
> > for each grid element), and using those statistics to compute the
> > exposure time and analog gain for the next frame. The process needs to
> > be repeated for every frame. Some sensors implement AEGC internally,
> > which could be useful for your use cases. Otherwise, statistics are
> > computed by the ISP, and libcamera runs an AEGC algorithm to process the
> > statistics, compute the exposure time and gain, and apply them to the
> > sensor.
> > 
> > Note that the AEGC algorithm will take several frames to converge, which
> > means that the first few frames captured from the camera won't be
> > properly exposed. Convergence may be faster when AEGC is implemented by
> > the sensor (no guarantee though, it would need to be tested).
> > 
> >> Usually autofocus isn't present on wildlife cameras - in commercial
> >> wildlife cameras the focus is fixed in manufacture and can't be changed
> >> without dismantling the wildlife camera - including watertight seals -
> >> or by using glue or blue-tack (I'm not joking!) to attach simple +/- 1,
> >> 2 or 3 dioptre lenses in front of the wildlife camera.
> >> 
> >> Just by offering the ability to set focus by manually adjusting the lens
> >> or set focus in software configuration CritterCam would be a big step
> >> forward - autofocus (for example with the RPi Camera v3) would be an
> >> optional nice-to-have.
> >> 
> >> The high bandwidth to the storage medium might be a problem - I'm not
> >> sure what the limits are for that. Commercial wildlife cameras seem to
> >> vomit enormous uncompressed photo and video files onto the SD card and
> >> leave them there.
> >> 
> >> This is out of scope for libcamera: The STM32N6 Series claims "MIPI
> >> CSI-2 interface and image signal processing (ISP)" and "an H264 hardware
> >> encoder" and the NXP S32V claims "Embedded image sensor processing (ISP)
> >> for HDR, color conversion, tone mapping, etc.", " Image signal processor
> >> (ISP), supporting 2x1 or 1x2 megapixel @ 30 fps and 4x2 megapixel for
> >> subset of functions (exposure control, gamma correction)" and "H.264
> >> video encode" so it looks like ISP and compression and encoding are
> >> there on some μCs in theory but the technology seems very new and the
> >> number of pixels and frame rate supported are limited.
> > 
> > It would be interesting to port libcamera to those smaller SoCs, running
> > real time operating systems, but it's a large project.
> > 
> >> On 2025-01-06 08:20, Laurent Pinchart wrote:
> >> > Hi Will,
> >> >
> >> > On Fri, Jan 03, 2025 at 09:52:13PM +0000, w.robertson at cairnwater.com
> >> > wrote:
> >> >> Hi Laurent,
> >> >>
> >> >> Thank you very much. I've redrafted the README.md file to get rid of the
> >> >> largely irrelevant information about CPU cores on SoCs and include the
> >> >> much more relevant information about ISP instead. Also reduced down the
> >> >> list of SoCs to SoCs with current or planned libcamera support. In the
> >> >> low power category I got rid of the NXP i.MX RT1170 and replaced it with
> >> >> the S32V2 which has an ISP. Let me know if I got it right and I'll
> >> >> redraft the 6. Development & Prototyping section:
> >> >>
> >> >> - **Candidate Processors**
> >> >> 	- **High Quality, High Flexibility**
> >> >> 		- **Supported by libcamera**
> >> >> 			- **ST STM32MP2 Series** CSI-2 #1 (5 Mpixels @30 fps with ISP), #2 (1
> >> >> Mpixels @15 fps no ISP)
> >> >> 			- **NXP i.MX 8M Plus** Dual ISPs: up to 12MP and 375MPixels/s
> >> >> 			- **NXP i.MX95** Preproduction: 500 Mpixel/s MIPI-CSI and ISP (2x
> >> >> 4-lane, 2.5 Gbps/lane)
> >> >> 			- **Chips Integrating Mali-C55 ISP**
> >> >> 				- **Renesas RZ/V2H** Mali-C55 ISPC
> >> >> 			- Chips Integrating **Mali C52 ISP**
> >> >>     				- Possibly in the longer term
> >> >> 			- **Chips Integrating Amlogic C3 ISP**
> >> >>     				- Currently in development by Amlogic
> >> >
> >> > That looks fine.
> >> >
> >> >> 	- **Low Power**
> >> >> 		- **ST STM32N6 Series μCs** ISP, MIPI CSI-2, H264 hardware
> >> >> 		- **NXP S32V2** Embedded ISP for HDR, color conversion, tone mapping,
> >> >> etc. enabled by ISP graph tool
> >> >
> >> > I have no experience with those chips, so I can't comment on this.
> >> >
> >> >> > I think that would be a good first target. Using a raw sensor with an
> >> >> > SoC that integrates an ISP gives you a very wide range of usable camera
> >> >> > sensors, with a wide range of prices and performances. It is the most
> >> >> > flexible solution, especially if you want a modular system that allows
> >> >> > users to pick a sensor based on their needs and budget. Adding support
> >> >> > for a new sensor is relatively easy. The downside is the limited choice of
> >> >> > SoCs, and the cost of bringing up a new ISP. As libcamera evolves, you
> >> >> > can expect support for more ISPs and SoCs, so the situation will improve
> >> >> > over time.
> >> >>
> >> >> Thank you very much - that's an enormous help - I'd got mixed up and had
> >> >> thought that the big cost was in the image sensor driver software not in
> >> >> the software for the ISP - it's an enormous help to me to know that the
> >> >> software for the ISP is the expensive part.
> >> >>
> >> >> Commercial wildlife camera manufacturers often seem to try to make money
> >> >> by price gouging researchers and conservationists but between NXP and ST
> >> >> there seems to be healthy competition and they seem to aim to
> >> >> make money through excellence and by winning market share in large,
> >> >> highly competitive and fast-growing markets - they've no interest in
> >> >> trying to price gouge conservationists.
> >> >>
> >> >> All the real intellectual heavy lifting in CritterCam is done by the
> >> >> libcamera team and is open source, so it's safe into the future.
> >> >>
> >> >> I don't have the specialist experience or funding needed to design a
> >> >> board for a modern SoC but prototyping using an evaluation board for a
> >> >> SoM or SBC then designing a carrier board for a SoM or SBC seems
> >> >> feasible. Maybe if CritterCam becomes widely used someone could find
> >> >> separate funding to reduce production costs by designing a board based
> >> >> on a reference design for a SoC without a SoM but that would be some
> >> >> time in the future.
> >> >>
> >> >> > Using a smart sensor with a simpler SoC gives you a much wider range of
> >> >> > SoCs to choose from, and a lower price point, and possibly with better
> >> >> > low power capabilities. The drawback is the limited choice of camera
> >> >> > sensors. If you want to provide very very low power consumption on some
> >> >> > models, this is an architecture that may still be worth investigating.
> >> >> >
> >> >> > In any case, I would recommend picking one of the two options to start
> >> >> > with, and exploring the second option later as an extension to the
> >> >> > product range if needed.
> >> >>
> >> >> It's difficult. I think fundamentally solar panels are becoming fairly
> >> >> cheap now but a software engineer's time is very expensive so there's an
> >> >> economic case for having the flexibility of an OS there and keeping
> >> >> firmware development costs to a minimum even if it comes at the price of
> >> >> relatively higher power consumption and relatively large solar panels.
> >> >>
> >> >> I kind-of get the feeling that a highly power efficient μC solution
> >> >> might be something to look at separately in the future if funding
> >> >> becomes available - STM32N6 and NXP S32V2 were the only μCs I could find
> >> >> with an ISP and STM32N6 was only launched in mid-December so the
> >> >> technology seems very new.
> >> >>
> >> >> > Yes, the list of pipeline handlers is the best place to check. STM32MP2
> >> >> > support will come (I don't know ST's roadmap, so I can't tell when).
> >> >>
> >> >> Do ST expect customers to use the V4L2 and Media Controller APIs for the
> >> >> time being until libcamera support is implemented?
> >> >
> >> > I'm not sure what they expect their customers to use for the time being,
> >> > I believe (and hope) libcamera is the solution they will push forward.
> >> >
> >> >> > Amlogic is working on support for their C3 ISP
> >> >>
> >> >> Is there a way of working out which of the Amlogic chips use their C3
> >> >> ISP or do all Amlogic chips with an ISP use either their C3 ISP or an
> >> >> ARM Mali ISP? (I was trying to work out if the Amlogic C308X's ISP is a
> >> >> C3 ISP but I couldn't work it out 🙈)
> >> >
> >> > Kieran, do you have more information about this ?
> >> >
> >> >> > There are V2H versions without the C55 ISP but with a simpler "ISP
> >> >> > lite" processing pipeline that could also be considered. They should be
> >> >> > cheaper, may consume less power, and could still provide acceptable
> >> >> > image quality.
> >> >>
> >> >> I don't think much about minimising the cost of the SoC - I reckon a lot
> >> >> of the modern SoCs with a camera pipeline offer exceptional value for
> >> >> money - there seems to be a big price increase and increase in power
> >> >> consumption between the SoC and the SoM or SBC so I tend to worry more
> >> >> about the cost and power consumption of the SoM or SBC. I tend to worry
> >> >> about power consumption in sleep mode a lot more than power consumption
> >> >> in active mode because wildlife cameras only tend to spend a very small
> >> >> proportion of their time in an active state.
> >> >>
> >> >> > Note that, to use an SoC without an ISP, you will need a full ISP in the
> >> >> > camera sensor. As far as I can tell from the data brief, the ISP in the
> >> >> > VD1940 doesn't implement a full processing pipeline, and still outputs
> >> >> > bayer data.
> >> >>
> >> >> Thanks - I should be cautious of that.
> >> >>
> >> >> > That being said, it also depends what you want to record. If your goal
> >> >> > is to capture still images, you could possibly get away without a real
> >> >> > ISP and use software processing on still images. You would still need to
> >> >> > implement AGC and AWB, and some SoCs (in the i.MX7 family for instance)
> >> >> > include a statistics computation engine that should be enough to do so.
> >> >> > Some camera sensors also implement AGC and AWB in the sensor itself,
> >> >> > albeit with less control on the algorithm, so it may or may not work for
> >> >> > your use cases.
> >> >>
> >> >> Goedele Verbeylen strongly prefers videos to still images.
> >> >>
> >> >> Would it be possible to stream raw data from an image sensor to a raw
> >> >> data file in real time then for ISP tasks to be carried out on the raw
> >> >> data at a later date when handling data in real time isn't important
> >> >> (this could be done after download to a more powerful computer or by a
> >> >> μC core gradually during the day when solar energy is plentiful)?
> >> >
> >> > It's theoretically possible, assuming again that auto-exposure is
> >> > implemented on the device. That, and auto-focus if needed, are the
> >> > only parts that can't be implemented in post-processing. You would
> >> > however need to store the data uncompressed, which will require a large
> >> > amount of storage, and high bandwidth to the storage medium.
> >> >
> >> >> On 2025-01-02 22:55, Laurent Pinchart wrote:
> >> >> > On Thu, Jan 02, 2025 at 11:53:07AM +0000, w.robertson at cairnwater.com wrote:
> >> >> >> Hi Laurent,
> >> >> >>
> >> >> >> Yes - I'm very worried about the ambitiousness of it and the risk of
> >> >> >> scope creep wrecking the project - I was thinking about limiting the
> >> >> >> scope to SoCs and image sensors supported by libcamera - do you think
> >> >> >> that would work? Anyone wanting a SoC or image sensor added that isn't
> >> >> >> supported by libcamera would have to estimate feasibility and cost for
> >> >> >> that separately.
> >> >> >
> >> >> > I think that would be a good first target. Using a raw sensor with an
> >> >> > SoC that integrates an ISP gives you a very wide range of usable camera
> >> >> > sensors, with a wide range of prices and performances. It is the most
> >> >> > flexible solution, especially if you want a modular system that allows
> >> >> > users to pick a sensor based on their needs and budget. Adding support
> >> >> > for a new sensor is relatively easy. The downside is the limited choice of
> >> >> > SoCs, and the cost of bringing up a new ISP. As libcamera evolves, you
> >> >> > can expect support for more ISPs and SoCs, so the situation will improve
> >> >> > over time.
> >> >> >
> >> >> > Using a smart sensor with a simpler SoC gives you a much wider range of
> >> >> > SoCs to choose from, and a lower price point, and possibly with better
> >> >> > low power capabilities. The drawback is the limited choice of camera
> >> >> > sensors. If you want to provide very very low power consumption on some
> >> >> > models, this is an architecture that may still be worth investigating.
> >> >> >
> >> >> > In any case, I would recommend picking one of the two options to start
> >> >> > with, and exploring the second option later as an extension to the
> >> >> > product range if needed.
> >> >> >
> >> >> >> > Most (if not all) of the camera sensors you list are raw camera sensors,
> >> >> >> > and most of the SoCs you list either don't include an ISP (SAMA7G54,
> >> >> >> > R2/G2, AM625, STM32N6, RT1170) or have an ISP that is not supported yet
> >> >> >> > by upstream kernel drivers and by libcamera (AM67A, TDA4VM, Genio 510).
> >> >> >> > It isn't clear to me how you plan to address that issue.
> >> >> >>
> >> >> >> Thank you very much. That's an extremely important point.
> >> >> >>
> >> >> >> What's the best way to work out which SoCs are supported by libcamera?
> >> >> >> Is it best to look in the git repository here?:
> >> >> >>
> >> >> >> https://git.libcamera.org/libcamera/libcamera.git/tree/src/libcamera/pipeline
> >> >> >>
> >> >> >> with the NXP i.MX8M Plus SoC having full support, and ST planning to
> >> >> >> add full support for the STM32MP2 Series during 2025 (currently partial
> >> >> >> support: https://wiki.st.com/stm32mpu/wiki/How_to_use_libcamera )?
> >> >> >
> >> >> > Yes, the list of pipeline handlers is the best place to check. STM32MP2
> >> >> > support will come (I don't know ST's roadmap, so I can't tell when).
> >> >> > There's work underway by NXP on i.MX95 support too. We are working on
> >> >> > the Mali C55 ISP, which is integrated in the Renesas RZ/V2H. As a
> >> >> > stretch goal we want to support the C52, found in at least some Amlogic
> >> >> > SoCs, but that's more of a spare time project at this point. Amlogic is
> >> >> > working on support for their C3 ISP (see
> >> >> > https://lore.kernel.org/r/20241227-c3isp-v5-0-c7124e762ff6@amlogic.com
> >> >> > and
> >> >> > https://patchwork.libcamera.org/project/libcamera/list/?series=4926), I
> >> >> > don't know when it will be ready for production. There will be more.
> >> >> >
> >> >> >> I think it would be an enormous help for CritterCam to focus only on
> >> >> >> SoCs and image sensors supported by libcamera with any SoCs and image
> >> >> >> sensors that aren't supported by libcamera being treated as optional
> >> >> >> add-on functionality for which costs and feasibility would have to be
> >> >> >> estimated separately.
> >> >> >>
> >> >> >> I'll abandon all the following because they do not have a hardware ISP:
> >> >> >>
> >> >> >> SAMA7G54
> >> >> >> i.MX RT1170
> >> >> >> Renesas RZ/G2
> >> >> >>
> >> >> >> However, the Renesas RZ/G2 could be replaced by the Renesas RZ/V2H,
> >> >> >> which has an ARM Mali-C55 ISP, since libcamera is working on Mali-C55 support
> >> >> >> https://www.renesas.com/en/software-tool/rzv2h-isp-support-package
> >> >> >> https://libcamera.org/entries/2024-01-31.html
> >> >> >
> >> >> > Indeed. I should have read your mail fully before writing the text above
> >> >> > :-) There are V2H versions without the C55 ISP but with a simpler "ISP
> >> >> > lite" processing pipeline that could also be considered. They should be
> >> >> > cheaper, may consume less power, and could still provide acceptable
> >> >> > image quality.
> >> >> >
> >> >> >> AM67A, TDA4VM and Genio 510 would all be put out of scope until they
> >> >> >> support libcamera.
> >> >> >
> >> >> > Those are SoCs we would really like to support.
> >> >> >
> >> >> >> STM32N6 can't run Linux but does have a hardware ISP "A dedicated
> >> >> >> computer vision pipeline with a MIPI CSI-2 interface and image signal
> >> >> >> processing (ISP) ensures compatibility with a wide range of cameras. The
> >> >> >> STM32N6 also features an H264 hardware encoder".
> >> >> >>
> >> >> >> I put together a list of the ST BrightSense image sensors and whether
> >> >> >> they do or do not have an integrated ISP or similar (Google Sheets
> >> >> >> https://docs.google.com/spreadsheets/d/1S1-Ji6rsUPARtQFwsfhblWhzQ-97r-zk/edit?usp=sharing&ouid=102590517794814819664&rtpof=true&sd=true
> >> >> >> or tab separated below):
> >> >> >>
> >> >> >> VG5761, VG5661, VD5761, VD5661 data sheet
> >> >> >> The block diagram of the Vx5y61 shows an ISP; however, the web pages
> >> >> >> don't all seem to reflect this
> >> >> >>
> >> >> >> VB56G4A	No ISP	Embedded autoexposure
> >> >> >> VD16GZ	No ISP	in-sensor AE and various corrections.
> >> >> >> VD1940	On-chip bayerization ISP
> >> >> >
> >> >> > Note that, to use an SoC without an ISP, you will need a full ISP in the
> >> >> > camera sensor. As far as I can tell from the data brief, the ISP in the
> >> >> > VD1940 doesn't implement a full processing pipeline, and still outputs
> >> >> > bayer data.
> >> >> >
> >> >> > That being said, it also depends what you want to record. If your goal
> >> >> > is to capture still images, you could possibly get away without a real
> >> >> > ISP and use software processing on still images. You would still need to
> >> >> > implement AGC and AWB, and some SoCs (in the i.MX7 family for instance)
> >> >> > include a statistics computation engine that should be enough to do so.
> >> >> > Some camera sensors also implement AGC and AWB in the sensor itself,
> >> >> > albeit with less control on the algorithm, so it may or may not work for
> >> >> > your use cases.
> >> >> >
> >> >> > Lots of options...
> >> >> >
> >> >> >> VD55G0	No ISP	Embedded auto-exposure
> >> >> >> VD55G1	No ISP	Various on-chip features to optimize image quality, such
> >> >> >> as autoexposure, noise reduction, or defect correction
> >> >> >> VD5661	No ISP	Embedded 16-bit video processing pipe with pixel defect
> >> >> >> correction, high dynamic range (HDR) merge with ghost removal, and
> >> >> >> programmable compression of dynamic range
> >> >> >> VD56G3	No ISP	in-sensor autoexposure and various corrections.
> >> >> >> VD5761	No ISP	Embedded 16-bit video processing pipe with pixel defect
> >> >> >> correction, high dynamic range (HDR) merge with ghost removal, and
> >> >> >> programmable compression of dynamic range
> >> >> >> VD66GY	No ISP	in-sensor autoexposure and various corrections.
> >> >> >> VG5761	No ISP	Embedded 16-bit video processing pipe with pixel defect
> >> >> >> correction, high dynamic range (HDR) merge with ghost removal, and
> >> >> >> programmable compression of dynamic range
> >> >> >> VG6640	No ISP	An embedded Bayer and monochrome data pre-processor
> >> >> >> VB1740	On-chip bayerization ISP
> >> >> >> VB1940	On-chip bayerization ISP
> >> >> >>
> >> >> >> For Sony I couldn't find the data but as far as I could work out the
> >> >> >> IMX708 doesn't have an integrated ISP.
> >> >> >>
> >> >> >> > Wake up latency is also something that will need to be considered very
> >> >> >> > carefully. If your power consumption requirements require powering down
> >> >> >> > the SoC and DRAM completely, the system will need to cold-boot when
> >> >> >> > woken up, which will probably take too long. Some workarounds are
> >> >> >> > possible but may require very extensive software development efforts,
> >> >> >> > and those workarounds may not be portable across different SoCs.
> >> >> >>
> >> >> >> Power consumption is a major factor - at the moment I'm looking at using
> >> >> >> a 120 W nominal solar panel - guessing that real output would be 5 % of
> >> >> >> that on average would give 6 W - using that to charge a c. 500 Wh PbA
> >> >> >> battery to run a SoM that consumes several hundred mW in sleep / suspend
> >> >> >> mode should give enough margin for safety during spring, summer and
> >> >> >> autumn - and perhaps winter - the measurement equipment hasn't arrived
> >> >> >> yet to test this though.
> >> >> >>
> >> >> >> > One last (smaller) technical comment: you mention ACPI system states (S0
> >> >> >> > to S4), while none of the SoCs you list use ACPI.
> >> >> >>
> >> >> >> Thanks - I'd just been asking the SoM and SoC manufacturers about a "low
> >> >> >> power sleep / suspend mode" not about ACPI specifically and I'd mixed
> >> >> >> the two up. I'm cautious about committing to this until I've checked
> >> >> >> whether the measured power consumption matches the manufacturer's stated
> >> >> >> power consumption.
> >> >> >
> >> >> > Real life measurements are definitely needed.
> >> >> >
> >> >> >> On 2025-01-01 23:08, Laurent Pinchart wrote:
> >> >> >> > On Tue, Dec 31, 2024 at 02:45:44PM +0000, w.robertson at cairnwater.com
> >> >> >> > wrote:
> >> >> >> >> Hi Laurent, Jacopo and Kieran,
> >> >> >> >>
> >> >> >> >> All my best wishes to you for 2025!
> >> >> >> >>
> >> >> >> >> I put together this overview summary of CritterCam that hopefully
> >> >> >> >> explains how the engineering and economics of the situation make it
> >> >> >> >> feasible - let me know if it would be of any help to you as an example
> >> >> >> >> use case (it might be a good one because I'm not trying to sell anything
> >> >> >> >> or make a sales pitch but I am trying to take the speed and aggression
> >> >> >> >> of modern software engineering to an until-now far-too-sleepy area of
> >> >> >> >> electronics):
> >> >> >> >>
> >> >> >> >> https://bitbucket.org/WillRobertRobertson/crittercam/src/main/README.md
> >> >> >> >
> >> >> >> > I went through the document. The project sounds very ambitious, I think
> >> >> >> > you will need to carefully plan intermediate steps on your way to the
> >> >> >> > end goal.
> >> >> >> >
> >> >> >> > One point that you may have overlooked is the design of the camera
> >> >> >> > pipeline. Roughly speaking, to capture processed images, you can use
> >> >> >> >
> >> >> >> > - A raw camera sensor connected to an SoC with an ISP (ideally hardware,
> >> >> >> >   possibly software with GPU offload at the cost of higher power
> >> >> >> >   consumption).
> >> >> >> >
> >> >> >> > - A raw camera sensor connected to a standalone ISP, itself connected to
> >> >> >> >   an SoC without an ISP (but with a camera capture interface, typically
> >> >> >> >   a CSI-2 or parallel receiver). This increases the cost and power
> >> >> >> >   consumption due to the external ISP. Wake up latencies can also be
> >> >> >> >   affected.
> >> >> >> >
> >> >> >> > - A camera sensor that integrates an ISP (a.k.a. smart sensor or YUV
> >> >> >> >   sensor) connected to an SoC without an ISP (but with a camera capture
> >> >> >> >   interface, typically a CSI-2 or parallel receiver).
> >> >> >> >
> >> >> >> > Most (if not all) of the camera sensors you list are raw camera sensors,
> >> >> >> > and most of the SoCs you list either don't include an ISP (SAMA7G54,
> >> >> >> > R2/G2, AM625, STM32N6, RT1170) or have an ISP that is not supported yet
> >> >> >> > by upstream kernel drivers and by libcamera (AM67A, TDA4VM, Genio 510).
> >> >> >> > It isn't clear to me how you plan to address that issue.
> >> >> >> >
> >> >> >> > Wake up latency is also something that will need to be considered very
> >> >> >> > carefully. If your power consumption requirements require powering down
> >> >> >> > the SoC and DRAM completely, the system will need to cold-boot when
> >> >> >> > woken up, which will probably take too long. Some workarounds are
> >> >> >> > possible but may require very extensive software development efforts,
> >> >> >> > and those workarounds may not be portable across different SoCs.
> >> >> >> >
> >> >> >> > One last (smaller) technical comment: you mention ACPI system states (S0
> >> >> >> > to S4), while none of the SoCs you list use ACPI.
> >> >> >> >
> >> >> >> > Please see below for more comments.
> >> >> >> >
> >> >> >> >> The in-depth analysis behind that is in separate files (far too long to
> >> >> >> include in the summary) that are in a private repository for now because
> >> >> >> >> they have email addresses, etc.  (Think I'm developing a preference for
> >> >> >> >> GitHub over BitBucket...)
> >> >> >> >>
> >> >> >> >> Also translated Goedele's most recent article to English :-)
> >> >> >> >>
> >> >> >> >> https://new-homes-for-old-friends.cairnwater.com/carved-tree-cavities-for-endangered-dormice/
> >> >> >> >>
> >> >> >> >> All the best for 2025!
> >> >> >> >>
> >> >> >> >> Will
> >> >> >> >>
> >> >> >> >> On 2024-12-22 19:22, w.robertson at cairnwater.com wrote:
> >> >> >> >> > Hi Laurent, Jacopo and Kieran,
> >> >> >> >> >
> >> >> >> >> > I was trying to think of ways that I could help out. All my current
> >> >> >> >> > experience is server side Python/.Net/SQL ( www.faclair.com is one
> >> >> >> >> > project) and as a climbing arborist so I don't have any kernel
> >> >> >> >> > experience.
> >> >> >> >> >
> >> >> >> >> > I'm compiling a list of SoMs, SiPs and SBCs that should in theory work
> >> >> >> >> > with libcamera with relevant information from their technical data
> >> >> >> >> > sheets and questions in emails to their manufacturers - at the moment
> >> >> >> >> > this is in CritterCam's git repository but I could make this available
> >> >> >> >> > for other libcamera users and testers if this would be a help?
> >> >> >> >> >
> >> >> >> >> > I looked at using a μC with a MIPI CSI-2 interface.
> >> >> >> >> >
> >> >> >> >> > From a conservation perspective, the research has been very successful
> >> >> >> >> > but trying to fund it from my commercial work has been financially
> >> >> >> >> > disastrous - I'm moving out of my flat and into my van to save money in
> >> >> >> >> > the hope that enough funding to survive might become available next
> >> >> >> >> > year - if I tried to port libcamera to a μC I'd go bankrupt long before
> >> >> >> >> > the work could be completed.
> >> >> >> >> >
> >> >> >> >> > The need for CritterCam to outlast all currently available silicon -
> >> >> >> >> > keeping up with very rapidly evolving processor and sensor technology -
> >> >> >> >> > and to support multiple contributors contributing both software
> >> >> >> >> > features and add-on off-the-shelf and custom electronic modules to a
> >> >> >> >> > single open source repository also encouraged me to look at a μP and
> >> >> >> >> > embedded Linux instead of a μC.
> >> >> >> >> >
> >> >> >> >> > The only ways I can see at the moment of having any possibility of
> >> >> >> >> > getting a high quality camera working on a μC would be to work together
> >> >> >> >> > with Raspberry Pi or ST.
> >> >> >> >> >
> >> >> >> >> > A talk by Naushir Patuck from Raspberry Pi in November 2024 mentioned
> >> >> >> >> > that Raspberry Pi were considering the possibility of a μC with MIPI
> >> >> >> >> > CSI-2 support:
> >> >> >> >> >
> >> >> >> >> > https://www.digikey.com/en/blog/webinar-introduction-raspberry-pi-imaging-and-computer-vision-tools
> >> >> >> >> >
> >> >> >> >> > ST seem to have very recently unveiled the STM32N6x7 (datasheet v1
> >> >> >> >> > November 2024) and STM32N6x5 (datasheet link gives 404 error)
> >> >> >> >> > Cortex-M55 μC with MIPI CSI-2 support:
> >> >> >> >> >
> >> >> >> >> > https://www.st.com/en/microcontrollers-microprocessors/stm32n6-series.html
> >> >> >> >> >
> >> >> >> >> >> the i.MX8MP, for instance, has a Cortex M core
> >> >> >> >> >
> >> >> >> >> > The STM32MP25 also includes a Cortex M33 - so the i.MX8MP and STM32MP25
> >> >> >> >> > raise the theoretical possibility of having both a low-power μC
> >> >> >> >> > solution and a more power hungry but much more flexible μP solution
> >> >> >> >> > that both use the same silicon - avoiding the potentially costly
> >> >> >> >> > headache of CritterCam maintaining separate low power μC and high
> >> >> >> >> > flexibility μP forks - Cortex M can't support CSI-2 so would be limited
> >> >> >> >> > to simpler camera modules.
> >> >> >> >
> >> >> >> > Whether or not you can control the camera hardware (CSI-2 receiver, ISP,
> >> >> >> > DMA engines, ...) from a Cortex M core depends on the integration of
> >> >> >> > those peripherals in a particular SoC, and thus varies between SoC
> >> >> >> > models. It's not an intrinsic property of Cortex M that it can't support
> >> >> >> > CSI-2.
> >> >> >> >
> >> >> >> >> >> I presume you would have more luck asking on the rpi forums.
> >> >> >> >> >
> >> >> >> >> > Every time the need for a low power sleep or suspend mode has been
> >> >> >> >> > raised on the RPi forums the answer has been a very decisive "No.
> >> >> >> >> > Switch it off and reboot." - this is a massive problem that renders RPi
> >> >> >> >> > SBCs effectively useless for any power sensitive application - it led
> >> >> >> >> > me to look at silicon from ST and NXP instead. I keep hoping that
> >> >> >> >> > something might appear from RPi but nothing has.
> >> >> >> >> >
> >> >> >> >> >> Yes, to realize and tune a camera system you need knowledge of the
> >> >> >> >> >> ISP, the image sensor and the optics.
> >> >> >> >> >>
> >> >> >> >> >> If a vendor instead provides tuning data pre-calculated for
> >> >> >> >> >> a specific set of camera modules designed to work with their platforms
> >> >> >> >> >> then yes, you would have a choice of pre-tuned cameras to pick from.
> >> >> >> >> >
> >> >> >> >> > Thinking about this - I used to work in the physics, chemistry and
> >> >> >> >> > nonlinear optics of optoelectronic materials - I don't know if this
> >> >> >> >> > could be any help or not - the thought went through my mind of
> >> >> >> >> > developing a small optical bench on which an image sensor and any
> >> >> >> >> > associated optics could be placed then red, green, blue and IR lasers
> >> >> >> >> > modulated and scanned over both its surface and possibly a known
> >> >> >> >> > reference image sensor while a computer reads data from it so that
> >> >> >> >> > precise calibration and tuning data for that particular make and model
> >> >> >> >> > of image sensor and any associated optics could be determined and open
> >> >> >> >> > sourced?
> >> >> >> >
> >> >> >> > Calibration and tuning for camera modules requires a test bench, but I
> >> >> >> > don't think lasers would help. See chapter 6 of
> >> >> >> > https://datasheets.raspberrypi.com/camera/raspberry-pi-camera-guide.pdf
> >> >> >> > for an example of a tuning procedure and the corresponding tools and
> >> >> >> > environment.
> >> >> >> >
> >> >> >> >> > I don't know if this would be of use or not (perhaps using
> >> >> >> >> > this data to train a simple NN could be a way of implementing
> >> >> >> >> > calibration and tuning algorithms if NPUs or GPUs become fast enough -
> >> >> >> >> > NPUs and GPUs might be good at handling the simple but highly parallel
> >> >> >> >> > tensor operations required without proprietary tie-in)?
> >> >> >> >> >
> >> >> >> >> > This might allow some flaws and limitations inherent in the
> >> >> >> >> > manufacturing of the sensor and optics to be compensated for through
> >> >> >> >> > firmware tuning and calibration?
> >> >> >> >
> >> >> >> > Flaws such as defective pixels, or intrinsic physical properties of the
> >> >> >> > camera module that degrade the image quality (noise, lens shading, ...)
> >> >> >> > are typically corrected by the ISP, based on calibration and tuning
> >> >> >> > data.
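
To make that concrete: lens shading correction, for example, boils down to
multiplying each pixel by a gain taken from a calibration-derived table. A toy
sketch in Python (the gain model and numbers here are invented for
illustration, not taken from any real tuning file):

```python
# Toy lens shading correction: each pixel of a single colour plane is
# multiplied by a gain that grows towards the image corners, where the
# lens attenuates light the most. A real ISP interpolates gains from a
# per-module calibration table; the formula below is illustrative only.

def shading_gain(x, y, width, height, max_gain=1.8):
    """Gain rises from 1.0 at the image centre to max_gain at the corners."""
    cx, cy = (width - 1) / 2, (height - 1) / 2
    # Normalised squared distance from the centre: 0 at centre, 1 at corners.
    d2 = ((x - cx) / cx) ** 2 / 2 + ((y - cy) / cy) ** 2 / 2
    return 1.0 + (max_gain - 1.0) * d2

def correct_shading(image):
    """image is a list of rows of 8-bit pixel values (one colour plane)."""
    height, width = len(image), len(image[0])
    return [
        [min(255, round(p * shading_gain(x, y, width, height)))
         for x, p in enumerate(row)]
        for y, row in enumerate(image)
    ]
```

The real work is in producing the gain table for a given module, which is
exactly what the tuning procedure in the Raspberry Pi guide covers.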
> >> >> >> >
> >> >> >> >> >> The implementation of
> >> >> >> >> >> the algorithms, how they use and combine features of different IPs on
> >> >> >> >> >> the platforms (gpu, other accelerators etc) and the tuning of the whole
> >> >> >> >> >> camera pipeline is usually what SoC vendors compete on.
> >> >> >> >> >
> >> >> >> >> >> Sensor driver manufacturers are not involved when it comes to ISP
> >> >> >> >> >> algorithms development. Knowledge of the ISP, its design and
> >> >> >> >> >> documentation are usually only accessible to the SoC vendors (again
> >> >> >> >> >> apart from RPi that fully documents their ISP publicly).
> >> >> >> >> >
> >> >> >> >> > Thank you very much - it's a big help for me to understand that - so
> >> >> >> >> > basically one way that businesses like ST and NXP compete with each
> >> >> >> >> > other is by competing to provide the best ISP algorithms as well as the
> >> >> >> >> > best silicon?
> >> >> >> >> >
> >> >> >> >> >> The RPi pipeline handler implements a selection logic to pick the
> >> >> >> >> >> "best" sensor configuration given a desired output streams configuration:
> >> >> >> >> >> https://git.libcamera.org/libcamera/libcamera.git/tree/src/libcamera/pipeline/rpi/common/pipeline_base.cpp#n937
> >> >> >> >> >
> >> >> >> >> >> libcamera offers an API to override the pipeline handler choice, so I
> >> >> >> >> >> would say your application is free to choose any configuration
> >> >> >> >> >> offered by the sensor's driver.
> >> >> >> >> >
> >> >> >> >> > Thank you very much - that's an enormous help - I'd been confused by
> >> >> >> >> > that. Typically commercial wildlife cameras boast the number of pixels
> >> >> >> >> > - often including nonexistent "interpolated pixels" - whereas the big
> >> >> >> >> > limiting factor on the image quality is the signal to noise ratio on
> >> >> >> >> > those pixels so binning may be very valuable - particularly in low
> >> >> >> >> > light conditions or when processing power and storage are limited.
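
Exactly: 2x2 binning averages (or sums) each block of four pixels, and for
uncorrelated noise that improves SNR by roughly sqrt(4) = 2x at the cost of
half the resolution in each dimension. A minimal sketch of the digital case:

```python
# 2x2 binning: average each block of four pixels. For uncorrelated noise
# this improves SNR by about sqrt(4) = 2x, while halving the resolution
# in each dimension.

def bin2x2(image):
    """image is a list of rows with even width and height."""
    return [
        [(image[y][x] + image[y][x + 1] +
          image[y + 1][x] + image[y + 1][x + 1]) / 4
         for x in range(0, len(image[0]), 2)]
        for y in range(0, len(image), 2)
    ]
```

Sensors that bin in the analog domain can do better than this digital average,
as readout noise is then added only once per binned pixel instead of four
times.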
> >> >> >> >> >
> >> >> >> >> >> Some vendors
> >> >> >> >> >> decide to distribute their "advanced" algorithms as binaries through
> >> >> >> >> >> different channels and support their platforms in mainline libcamera with
> >> >> >> >> >> a reduced number of features in the source implementation.
> >> >> >> >> >
> >> >> >> >> > Why don't they contribute their "advanced" proprietary features as
> >> >> >> >> > obfuscated and compiled binary plugins for libcamera?
> >> >> >> >
> >> >> >> > They can do that, and some vendors are looking at this for their newest
> >> >> >> > SoCs. I believe ST will have a closed-source plugin (we call those IPA
> >> >> >> > modules in libcamera, for Image Processing Algorithms) for the STM32MP2
> >> >> >> > (don't quote me on that though). There will be more vendors adopting
> >> >> >> > that approach.
> >> >> >> >
> >> >> >> >> >> Maybe that's why I've never seen a flying squirrel in Nuuksio :-(
> >> >> >> >> >
> >> >> >> >> > The bad news is that their numbers have been dropping very rapidly. The
> >> >> >> >> > good news is that research into Pteromys volans by Ralf Wistbacka and
> >> >> >> >> > others and into Red Squirrels, Hazel Dormice and Garden Dormice by
> >> >> >> >> > Goedele Verbeylen, Thomas Briner and others has identified lack of
> >> >> >> >> > suitable nesting holes as a significant cause for this decline and that
> >> >> >> >> > we now have very fast, precise and scalable chainsaw and rotary carving
> >> >> >> >> > techniques to create nest holes for them. The other good news is that
> >> >> >> >> > Goedele Verbeylen and her colleagues have brought Garden Dormice back
> >> >> >> >> > from the brink of extinction in Belgium, proving that it can be done.
> >> >> >> >> >
> >> >> >> >> >> If you need to
> >> >> >> >> >> operate with really low power, keeping the Cortex A cores and DRAM
> >> >> >> >> >> powered, even in low-power mode, may go over your power budget.
> >> >> >> >> >
> >> >> >> >> > Yup. I'm keeping a close eye on reported power consumption for SoMs and
> >> >> >> >> > I'll measure power consumption. I'm also trying to keep everything
> >> >> >> >> > small to fit into a small amount of RAM and tentatively rejecting any
> >> >> >> >> > board that uses DDR4 instead of LPDDR4.
> >> >> >> >> >
> >> >> >> >> > The power available is limited by the amount of sun, size of solar
> >> >> >> >> > panels, size of rechargeable batteries and battery chemistry (Li and
> >> >> >> >> > LiFePO4 batteries work well in summer but only PbA batteries work below
> >> >> >> >> > a few °C). It may be that tiny wind generators could be added in
> >> >> >> >> > difficult conditions. Like any tree dwelling creature, CritterCam has
> >> >> >> >> > to be very light, flexible and adaptable to be successful.
> >> >> >> >> >
> >> >> >> >> > Simply by allowing the battery, solar panel and the μP to be in
> >> >> >> >> > separate boxes, allowing videos and photos to be optionally compressed
> >> >> >> >> > and allowing WiFi or Ethernet to be switched on to download data,
> >> >> >> >> > CritterCam will remove the need for conservationists and researchers to
> >> >> >> >> > risk their lives on wobbly ladders changing batteries and SD cards.
> >> >> >> >
> >> >> >> > Modular designs are always tempting, but they come at a cost. I would
> >> >> >> > advise, for the first version, focusing on the components where
> >> >> >> > modularity would bring the best added value, and keeping the other
> >> >> >> > components in a more monolithic design. An external solar panel seems to
> >> >> >> > make sense, and making the connection waterproof shouldn't be too
> >> >> >> > difficult. An external camera module, on the other hand, would be more
> >> >> >> > difficult, with (in my opinion) less added value. You could still make
> >> >> >> > the design within the main enclosure modular to let users pick among
> >> >> >> > different camera sensors, or decide what communication module(s) they
> >> >> >> > want to install.
> >> >> >> >
> >> >> >> >> > An energy and power meter I ordered should arrive in a week or two and
> >> >> >> >> > I'll start measuring how much power a solar panel can give in the
> >> >> >> >> > typical worst case of a forest floor.
> >> >> >> >> >
> >> >> >> >> > When things get really desperate an optional GSM module could be
> >> >> >> >> > switched on to send a simple SMS or MQTT message to ask someone to come
> >> >> >> >> > and change a low battery for a fully charged one and consider adding
> >> >> >> >> > more solar panels.
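
Such an alert needs nothing more than a small self-describing payload that can
go out over MQTT or be flattened into an SMS. One possible shape (the field
names and voltage threshold are invented for illustration):

```python
# Hypothetical low-battery alert payload for an MQTT message or SMS.
# Field names and the 11.8 V threshold are invented for illustration;
# a real threshold would depend on the battery chemistry in use.
import json

def battery_alert(camera_id, battery_volts, threshold_volts=11.8):
    """Return a JSON alert string, or None while the battery is healthy."""
    if battery_volts >= threshold_volts:
        return None
    return json.dumps({
        "camera": camera_id,
        "event": "low_battery",
        "battery_v": round(battery_volts, 2),
        "action": "replace battery, consider adding solar panels",
    })
```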
> >> >> >> >> >
> >> >> >> >> >> C-PHY is more recent, it can provide the same bandwidth with a lower
> >> >> >> >> >> number of signals, but isn't as widely supported (I expect that to
> >> >> >> >> >> improve over time).
> >> >> >> >> >
> >> >> >> >> > That's good to know - C-PHY's 3 level signalling must be giving big
> >> >> >> >> > headaches for silicon designers used to working with digital signals.
> >> >> >> >> > Good to know that for the time being D-PHY is the best supported.
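
The bandwidth arithmetic behind that comparison is simple: a D-PHY link
carries lanes x per-lane bitrate, while C-PHY encodes 16 bits in 7 symbols
(about 2.28 bits per symbol) on each 3-wire trio. A back-of-the-envelope
helper, with the rates left as parameters since they vary by implementation:

```python
# Back-of-the-envelope CSI-2 link capacity. The rates passed in are
# illustrative; actual supported rates depend on the sensor and SoC.

def dphy_capacity_gbps(lanes, gbps_per_lane):
    """D-PHY: one differential pair per data lane; capacity = lanes x bitrate."""
    return lanes * gbps_per_lane

def cphy_capacity_gbps(trios, gsym_per_sec):
    """C-PHY: 16 bits per 7 symbols (~2.28 bits/symbol) on each 3-wire trio."""
    return trios * gsym_per_sec * 16 / 7
```

This is why C-PHY gets comparable throughput out of fewer signal wires, and
also why its 3-level signalling is harder on PHY designers.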
> >> >> >> >> >
> >> >> >> >> >> CSI-3 was killed by Intel and Qualcomm before it reached the market
> >> >> >> >> >
> >> >> >> >> > That's good to know. One less thing to keep track of. CSI-1 and CSI-3
> >> >> >> >> > RIP.
> >> >> >> >> >
> >> >> >> >> > Will
> >> >> >> >> >
> >> >> >> >> > On 2024-12-22 10:30, Laurent Pinchart wrote:
> >> >> >> >> >> Hi Will,
> >> >> >> >> >>
> >> >> >> >> >> On Fri, Dec 20, 2024 at 11:04:09AM +0000, w.robertson at cairnwater.com wrote:
> >> >> >> >> >>> Hello Laurent and Kieran,
> >> >> >> >> >>>
> >> >> >> >> >>> > This sounds very interesting. We rarely get contacted by people working
> >> >> >> >> >>> > on projects related to animals or nature. I live in Finland, so I would
> >> >> >> >> >>> > be happy to help the flying squirrels from Nuuksio :-)
> >> >> >> >> >>>
> >> >> >> >> >>> That's wonderful! Dormouse species hibernate but the flying squirrels
> >> >> >> >> >>> can't hibernate and so must have nest holes which provide protection
> >> >> >> >> >>> from weather and allow them to build well insulated nests to survive the
> >> >> >> >> >>> winter, they also ideally need nest holes with entrances which are the
> >> >> >> >> >>> right size to let them in but keep the larger red squirrels out. Past
> >> >> >> >> >>> forestry practice was to remove dead trees and trees with holes (which
> >> >> >> >> >>> were seen as unsightly or unstable) and this has left a drastic shortage
> >> >> >> >> >>> of suitable nest holes for the flying squirrels.
> >> >> >> >> >>
> >> >> >> >> >> Maybe that's why I've never seen a flying squirrel in Nuuksio :-(
> >> >> >> >> >>
> >> >> >> >> >>> Most of my research is unpaid (with occasional small sponsorship here
> >> >> >> >> >>> and there) so resources have to be used very efficiently.
> >> >> >> >> >>>
> >> >> >> >> >>> By developing new rotary boring and precision chainsaw carving
> >> >> >> >> >>> techniques we can carve nest holes entirely through a single narrow,
> >> >> >> >> >>> very precise entrance with a much larger nesting chamber behind.
> >> >> >> >> >>>
> >> >> >> >> >>> > I'll try to help, in my (unfortunately limited) free time. The best
> >> >> >> >> >>> > option, if compatible with your needs, would be to have open
> >> >> >> >> >>> > discussions
> >> >> >> >> >>> > either on the libcamera development mailing list, or on the project's
> >> >> >> >> >>> > IRC channel (you can find contact information in
> >> >> >> >> >>> > https://libcamera.org/contributing.html). That way other developers
> >> >> >> >> >>> > will also be able to chime in.
> >> >> >> >> >>>
> >> >> >> >> >>> That would be wonderful!
> >> >> >> >> >>>
> >> >> >> >> >>> > libcamera doesn't currently support the STM32MP2, but ST is working on
> >> >> >> >> >>> > it. They showcased a working prototype at Embedded World in April this
> >> >> >> >> >>> > year, and I assume they will provide a version usable in production
> >> >> >> >> >>> > soon. While they haven't started upstreaming much of their code in
> >> >> >> >> >>> > libcamera, I believe they plan to do so. The platform is therefore
> >> >> >> >> >>> > interesting for low-power applications.
> >> >> >> >> >>>
> >> >> >> >> >>> Thank you very much - I'd been unsure about that - that's very helpful
> >> >> >> >> >>> to know - I'll maybe try to find a way to give ST some gentle
> >> >> >> >> >>> encouragement in the spring - they're pushing aggressively to sell both
> >> >> >> >> >>> their image sensors and STM32MP2 so libcamera support should be a high
> >> >> >> >> >>> strategic priority for them. I get the feeling that having launched
> >> >> >> >> >>> STM32MP2 into both industrial control and domestic appliance markets
> >> >> >> >> >>> they've got a fairly heavy workload at the moment and the STM32MP257F-DK
> >> >> >> >> >>> dev. board is due to be launched later than planned.
> >> >> >> >> >>>
> >> >> >> >> >>> > The i.MX8MP is much better supported in libcamera at the moment, and we
> >> >> >> >> >>> > are actively improving its support. There are low-cost development
> >> >> >> >> >>> > boards, such as the Debix Model A that we use for development (note that
> >> >> >> >> >>> > its camera connector is not standard, but they have a Debix I/O board
> >> >> >> >> >>> > that can interface the Model A camera connector to a Raspberry Pi
> >> >> >> >> >>> > 15-pin connector).
> >> >> >> >> >>>
> >> >> >> >> >>> Thank you very much! That's wonderful to have a good dev. board! I'd
> >> >> >> >> >>> been looking at i.MX8MP SoMs and SiPs but some of them still seem to be
> >> >> >> >> >>> in the design stage with some technical information missing from data
> >> >> >> >> >>> sheets. Will Debix Model A's LPDDR4 allow it to quickly enter and waken
> >> >> >> >> >>> from a low power sleep mode?
> >> >> >> >> >>
> >> >> >> >> >> I haven't experimented with low-power sleep and wake-up times myself, so I
> >> >> >> >> >> can't tell.
> >> >> >> >> >>
> >> >> >> >> >> Generally speaking, it all depends on your power budget. If you need to
> >> >> >> >> >> operate with really low power, keeping the Cortex A cores and DRAM
> >> >> >> >> >> powered, even in low-power mode, may go over your power budget. Other
> >> >> >> >> >> options are possible, such as turning power to the Cortex A and DRAM
> >> >> >> >> >> off, and handling fast wake up tasks in a smaller core (the i.MX8MP, for
> >> >> >> >> >> instance, has a Cortex M core that could be used for this). The wake up
> >> >> >> >> >> time gets significantly increased, as Linux will need to boot from
> >> >> >> >> >> scratch. You may be able to start powering up the camera sensor in
> >> >> >> >> >> parallel from the Cortex M core to save some time, or even drive the
> >> >> >> >> >> camera completely from the Cortex M, but that would require porting
> >> >> >> >> >> libcamera (and drivers) to whatever OS you would be running there, which
> >> >> >> >> >> would require really significant effort.
> >> >> >> >> >>
> >> >> >> >> >>> The 4 lane MIPI CSI seems potentially more
> >> >> >> >> >>> flexible than the STM32MP2's 2 lane CSI-2 interface.
> >> >> >> >> >>
> >> >> >> >> >> 4 lanes will give you more bandwidth, but unless you target really high
> >> >> >> >> >> resolutions at high frame rates, that shouldn't be necessary.
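
For scale: a rough estimate of the raw bandwidth a sensor mode needs is width
x height x bits-per-pixel x frame rate, plus some blanking and protocol
overhead. A quick sketch with illustrative numbers:

```python
# Rough CSI-2 bandwidth requirement for a sensor mode, ignoring blanking
# and protocol overhead (budget roughly 10-20% extra in practice).

def required_mbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e6

# e.g. 1080p RAW10 at 30 fps needs about 622 Mbit/s, comfortably within
# a 2-lane D-PHY link at common lane rates.
print(required_mbps(1920, 1080, 10, 30))
```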
> >> >> >> >> >>
> >> >> >> >> >>> I guess the best way for me to get a Debix Model A or B is just to order
> >> >> >> >> >>> it from RS?
> >> >> >> >> >>
> >> >> >> >> >> I believe so.
> >> >> >> >> >>
> >> >> >> >> >>> (One thing that's confusing me - when modern data sheets like the Debix
> >> >> >> >> >>> Model A data sheet say "MIPI CSI 4-lane" I guess they mean "MIPI CSI-2"?
> >> >> >> >> >>> I'd heard of CSI-1 which supports only 1 lane and seems old and rarely
> >> >> >> >> >>> used now, of CSI-2 which supports multiple lanes and theoretically a
> >> >> >> >> >>> choice of C-PHY or D-PHY physical layers (though I guess many CSI-2
> >> >> >> >> >>> implementations maybe don't implement C-PHY because of its complexity?)
> >> >> >> >> >>> for the physical layer and CSI-3 which is based on UniPro and rarely
> >> >> >> >> >>> implemented at the moment.)
> >> >> >> >> >>
> >> >> >> >> >> Today the term "MIPI CSI" (or sometimes just "MIPI camera") refers to
> >> >> >> >> >> CSI-2. CSI-1 was retired a very long time ago, and CSI-3 was killed by
> >> >> >> >> >> Intel and Qualcomm before it reached the market (there were some
> >> >> >> >> >> prototype implementations, but nothing in mass production).
> >> >> >> >> >>
> >> >> >> >> >> D-PHY is the most common PHY for CSI-2, and should be supported
> >> >> >> >> >> everywhere. C-PHY is more recent, it can provide the same bandwidth with
> >> >> >> >> >> a lower number of signals, but isn't as widely supported (I expect that
> >> >> >> >> >> to improve over time). I don't think the choice of PHY matters too much
> >> >> >> >> >> in your case, whatever is supported by both the camera sensor and the
> >> >> >> >> >> SoC will be fine.
> >> >> >> >> >>
> >> >> >> >> >>> Thank you very much for all your help!
> >> >> >> >> >>>
> >> >> >> >> >>> Will
> >> >> >> >> >>>
> >> >> >> >> >>> On 2024-12-19 23:17, Laurent Pinchart wrote:
> >> >> >> >> >>> > Hello Will,
> >> >> >> >> >>> >
> >> >> >> >> >>> > (CC'ing my colleague Kieran)
> >> >> >> >> >>> >
> >> >> >> >> >>> > On Thu, Dec 19, 2024 at 07:46:53PM +0000, w.robertson at cairnwater.com wrote:
> >> >> >> >> >>> >> Hi Laurent
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> Thank you very much for your video "Giving Linux a Camera Stack:
> >> >> >> >> >>> >> libcamera's 3 Years Journey and Exciting Future" and the wonderful work
> >> >> >> >> >>> >> of you and the libcamera developers in bringing such a powerful and
> >> >> >> >> >>> >> structured approach to such an extremely difficult and complex
> >> >> >> >> >>> >> problem.
> >> >> >> >> >>> >
> >> >> >> >> >>> > You're more than welcome.
> >> >> >> >> >>> >
> >> >> >> >> >>> >> I work as a climbing arborist in practice and research and software
> >> >> >> >> >>> >> engineer. My research is mostly voluntary and focused on new minimally
> >> >> >> >> >>> >> invasive techniques to carve critically needed nest holes for endangered
> >> >> >> >> >>> >> tree dwelling dormouse species (Hazel Dormouse, Garden Dormouse, Forest
> >> >> >> >> >>> >> Dormouse), bat species and in Finland the European Flying Squirrel
> >> >> >> >> >>> >> (Pteromys volans).
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> To overcome problems and limitations with commercial wildlife cameras
> >> >> >> >> >>> >> for small mammals I started work on CritterCam which - if I can get it
> >> >> >> >> >>> >> to work - will be a fully open source wildlife camera system which is
> >> >> >> >> >>> >> vastly better, vastly more flexible, vastly more independent in the
> >> >> >> >> >>> >> field and significantly cheaper than anything commercially available at
> >> >> >> >> >>> >> the moment. Working with embedded Linux and with several different
> >> >> >> >> >>> >> manufacturers for the microprocessor and image sensors for CritterCam
> >> >> >> >> >>> >> means that CritterCam users can never be price gouged and CritterCam
> >> >> >> >> >>> >> will be able to adapt to update all of its critical components with
> >> >> >> >> >>> >> newer alternatives well into the future.
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> This is an article with some videos of Garden Dormice exploring our
> >> >> >> >> >>> >> carved nest holes published in Flemish yesterday by Goedele Verbeylen:
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> https://www.natuurpunt.be/nieuws/boomholtes-kerven-voor-bedreigde-eikelmuizen
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> This is a short summary that I wrote in English and German:
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> https://drive.google.com/file/d/1jkFA-PgDr-O9-nGSgODCUFkukj63F4fc/view?usp=sharing
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> We have a small website at:
> >> >> >> >> >>> >>
> >> >> >> >> >>> >> new-homes-for-old-friends.cairnwater.com
> >> >> >> >> >>> >
> >> >> >> >> >>> > This sounds very interesting. We rarely get contacted by people working
> >> >> >> >> >>> > on projects related to animals or nature. I live in Finland, so I would
> >> >> >> >> >>> > be happy to help the flying squirrels from Nuuksio :-)
> >> >> >> >> >>> >
> >> >> >> >> >>> >> My background is mainly server side so I don't have kernel experience
> >> >> >> >> >>> >> and I was wondering if you or another libcamera developer might be
> >> >> >> >> >>> >> willing to give me any pointers to help me get started using supported
> >> >> >> >> >>> >> MIPI CSI-2 cameras with embedded Linux or Android?
> >> >> >> >> >>> >
> >> >> >> >> >>> > I'll try to help, in my (unfortunately limited) free time. The best
> >> >> >> >> >>> > option, if compatible with your needs, would be to have open
> >> >> >> >> >>> > discussions
> >> >> >> >> >>> > either on the libcamera development mailing list, or on the project's
> >> >> >> >> >>> > IRC channel (you can find contact information in
> >> >> >> >> >>> > https://libcamera.org/contributing.html). That way other developers
> >> >> >> >> >>> > will
> >> >> >> >> >>> > also be able to chime in.
> >> >> >> >> >>> >
> >> >> >> >> >>> >> I'm hoping to build a first prototype using the STM32MP257F-DK dev.
> >> >> >> >> >>> >> board based on the STM32MP257 μP when it becomes available in January
> >> >> >> >> >>> >> but I'm also looking at SoMs and SiPs based on other μPs like the NXP
> >> >> >> >> >>> >> i.MX 8M Plus that support a CSI-2 interface and a low power sleep mode
> >> >> >> >> >>> >> -
> >> >> >> >> >>> >> I can also do some initial testing using a Raspberry Pi SBC.
> >> >> >> >> >>> >
> >> >> >> >> >>> > libcamera doesn't currently support the STM32MP2, but ST is working on
> >> >> >> >> >>> > it. They showcased a working prototype at Embedded World in April this
> >> >> >> >> >>> > year, and I assume they will provide a version usable in production
> >> >> >> >> >>> > soon. While they haven't started upstreaming much of their code in
> >> >> >> >> >>> > libcamera, I believe they plan to do so. The platform is therefore
> >> >> >> >> >>> > interesting for low-power applications.
> >> >> >> >> >>> >
> >> >> >> >> >>> > The i.MX8MP is much better supported in libcamera at the moment, and we
> >> >> >> >> >>> > are actively improving its support. There are low-cost development
> >> >> >> >> >>> > boards, such as the Debix Model A that we use for development (note
> >> >> >> >> >>> > that
> >> >> >> >> >>> > its camera connector is not standard, but they have a Debix I/O board
> >> >> >> >> >>> > that can interface the Model A camera connector to a Raspberry Pi
> >> >> >> >> >>> > 15-pin connector).
> >> >> >> >> >>> >
> >> >> >> >> >>> > Raspberry Pi platforms are also well supported, with the work being
> >> >> >> >> >>> > performed by Raspberry Pi.

-- 
Regards,

Laurent Pinchart
