Running local AI models on SBCs has been a thing for a while now, with many boards even featuring built-in NPUs or other hardware specifically catering to AI workloads. The trusty Raspberry Pi lineup had none of that — until now. The official Raspberry Pi AI Kit combines a first-party M.2 HAT with the Hailo-8L AI accelerator module to bring the Raspberry Pi 5 up to speed for just $70.
On NPUs and SBCs…
Artificial intelligence is not just about flashy, cutting-edge generative models running on extremely powerful enterprise hardware, churning data and answering life’s big (and not-so-big) questions. Just as important is a concept called on-device AI. The name sums the idea up quite succinctly: it’s all about running AI models right on end-user or edge hardware instead of relying on cloud servers.
AI workflows rely on parallel computing. In traditional systems, this is handled by the GPU. And while GPUs are still the gold standard when it comes to running and training AI models, they tend to draw hundreds of watts. That’s okay for servers and desktops, but far from ideal for portable or embedded systems.
There’s another, newer approach – NPUs. These come in a variety of flavors: they can be embedded right into the SoC or come as standalone accelerator chips or modules. NPUs are simpler devices and, unlike dedicated GPUs, often come without VRAM of their own, instead sharing available system memory much like an integrated GPU does. Their performance is limited, with the current highest-end models delivering ~45 TOPS. However, NPUs are incredibly power efficient, usually drawing only a watt or two.
With this kind of performance, it’s clear that training models is a no-go, but that’s not what NPUs were designed for anyway. They are optimized for inferencing, enabling systems to efficiently handle certain AI workloads locally.
With companies cramming more and more AI-powered features into their products, NPUs are becoming increasingly relevant in all kinds of hardware — and they’ve already been ubiquitous in phones for a number of years. A lot of functionality we take for granted nowadays, like offline speech recognition, noise suppression, background blur in video calls or object detection in the camera app, is made possible by these little accelerators.
But why not run all these models on the cloud?
In consumer-grade products, it’s often about privacy. Having to beam a bunch of personal data to and from a remote server is inherently less secure than handling it locally. Biometric login systems are a great example of why this matters — would you really want a detailed scan of your face or fingerprint making its way around the globe, encrypted or not? And then there’s the connectivity concern. Designing critical system features around the notion that a device will never drop its internet connection is nearsighted at best.
IoT systems can also benefit from on-device AI (often called edge AI in this context). Having edge nodes send processed information instead of raw sensor data can often significantly simplify and reduce the amount of networking infrastructure required. The potential savings get quite significant when said raw data consists of bitstreams from cameras, LIDAR sensors or the like — as these can eat up a lot of bandwidth fast.
With a lot of developers excited to incorporate AI into their next project, and many competing SBCs already featuring embedded NPUs, it was due time for the Raspberry Pi to get some AI smarts of its own. Enter Raspberry Pi AI Kit.
Keep in mind that this kit is only compatible with the newest Raspberry Pi 5, as it requires native PCIe support — so if you don’t have one already, be sure to pick one up. Most of the demos also require a Raspberry Pi Camera. Any model should suffice, but we recommend picking up a standard Raspberry Pi Camera Module 3.
Before continuing, we’d like to thank our friends at Raspberry Pi for providing us with a review unit of the Raspberry Pi AI Kit (alongside a few more goodies required for the review). We’d also like to thank Hailo for sending in a Hailo-8 Starter Kit which will make a little guest appearance later in the review.
Hardware overview
The Raspberry Pi AI Kit is a cleverly designed package. Instead of reinventing the wheel, the team went for a more conservative approach, reusing the Raspberry Pi M.2 HAT+ and strapping a Hailo-8L AI accelerator module onto it. Hey, if it ain’t broke, don’t fix it.
The included Hailo-8L module is Hailo’s entry-level offering, delivering a modest 13 TOPS in a 2242-sized module. It connects to the Raspberry Pi using its M.2 B+M Key connector. The Hailo-8L is a cut-down variant of the “full” Hailo-8 accelerator. The latter is capable of delivering double the performance: 26 TOPS.
We were curious as to why Raspberry Pi went for the lower-end module, so we asked. We were told that it came down to “cost optimization, while providing sufficient performance for our target applications”. Well, that’s about what we expected, especially when there’s a $70 price point to match – and that includes the rest of the hardware in the kit.
Speaking of the rest of the hardware, the Raspberry Pi M.2 HAT+ nominally exposes a single PCIe 2.0 lane through an M.2 M Key slot. This can be bumped up to a single PCIe 3.0 lane by changing a few settings, which doubles the transfer speed to 1 GB/s. Official documentation warns that this might lead to some instability, but we’ve been running a Raspberry Pi 5 at PCIe 3.0 speeds in the SunFounder Pironman 5 case for quite some time with no issues whatsoever.
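For reference, the bump to PCIe 3.0 comes down to a single line in the boot configuration (this is the dtparam documented for the Raspberry Pi 5; a reboot is required for it to take effect):

# add to /boot/firmware/config.txt, then reboot
dtparam=pciex1_gen=3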
This is our first time with the first-party M.2 HAT+, though, and our experience with it so far has been good. We like the overall build quality and especially the snazzy little knurled thumbscrew used for securing M.2 modules in place. While it’s probably best to leave the in-depth analysis for a (potential) separate article, we’ll still point out two limitations we feel are relevant to anyone eyeing this kit.
First of all, it’s important to keep in mind that there are only two mounting holes on the Raspberry Pi M.2 HAT+, meant for 2230- and 2242-sized modules. Not a big deal in the context of this kit, but it might slightly limit your options if you ever decide to use the M.2 HAT+ with an SSD instead of the Hailo accelerator.
Secondly, the physical design of the M.2 HAT+ means you’ll be sacrificing some functionality of your Raspberry Pi. If you’re using the official Raspberry Pi 5 Case, you’ll have to remove the built-in fan, which degrades thermal performance significantly. Sure, the svelte Raspberry Pi Active Cooler fits (and is probably your best bet for good thermals with the M.2 HAT+ on), but in order to leave enough of a gap so as not to completely block airflow, you’ll need to use the 16 mm standoffs included for this exact purpose. Sadly, the stacking header that comes with the kit is just 10 mm tall, which effectively leaves you with no GPIO access as the pins sit flush with the passthrough header on the M.2 HAT+ itself. We get that this was probably done for neatness’ sake, but still find it a questionable design decision, especially since robust GPIO features are a major reason people opt for systems like the Raspberry Pi in the first place.
You could bodge things a little and forgo using any sort of cooling, leaving the standoffs out (we do not recommend this, by the way). This way, you can kind of sandwich the Raspberry Pi 5 and the M.2 HAT+ together in such a way that there’s a bit of the stacking header sticking out, but at this point you’ll be burning both the BCM2712 and whatever module you’ve got mounted on the HAT+ alive.
The real solution would be purchasing your own 13.3 mm (or longer) stacking header. This isn’t that big of a deal, as it’s a pretty cheap buy, but we would have preferred seeing one included with the kit. Something about the design decision to effectively take away a feature and require a separate bit of kit to regain access to it just rubs us the wrong way. We’re well aware that Raspberry Pi products have always had very low profit margins and that every penny spent counts, but we’d gladly pay a couple of cents more for the kit as a whole knowing everything needed is included.
These two gripes aside (both aimed more at the M.2 HAT+ itself than the kit as a whole), we don’t have much else to complain about when it comes to the hardware.
With this out of the way, let’s actually assemble the Raspberry Pi AI Kit and run some demos!
Getting started with the Raspberry Pi AI Kit
The installation procedure is quite straightforward, especially as the Hailo-8L comes pre-installed on the Raspberry Pi M.2 HAT+.
This is when we found out that, apparently, the kit was supposed to ship with a thermal pad pre-fitted between the accelerator module and the HAT+. Well, ours didn’t!
We were ready to blame thermal-pad-loving goblins living under our lab desk, but still wanted to check what was going on. A mistake on the website listing? Something to do with the fact that we’re dealing with a pre-release unit? We asked Raspberry Pi about this and got confirmation that it’s the latter — retail units do in fact ship with a thermal pad. Mystery solved!
In order to minimize any potential risks, we decided to install our own thermal pad. After measuring the gap between the two surfaces, it seemed that a 1.5 mm pad would be ideal — and there was some Arctic TP-3 of the right thickness in the lab. This is a very soft thermal pad, which should help it make good contact with the uneven back surface of the Hailo-8L module. Craft knife in hand, it was time to get to work.
Well, that seems to have worked. It’s all nice and snug now! Admittedly, it might not be the neatest bit of handiwork ever, but it’ll help dissipate the heat and will (hopefully) closely emulate the thermal setup found on the retail version of the AI Kit.
Let’s kick the intended build process off by mounting the official Active Cooler. Every time we have to do this, we’re thankful that the Raspberry Pi 5 finally features dedicated mounting holes for heatsinks, as it makes the procedure so much more straightforward. Two push pins and one fan connector later, we were all set.
The next step was attaching the stacking header to the Raspberry Pi 5’s GPIO header and fastening the four standoffs. After this, it was a matter of connecting the HAT+ and the Raspberry Pi 5 using the supplied PCIe ribbon cable while making sure the headers were aligned correctly. It’s a bit fiddly, sure, but nothing a steady hand and a little patience can’t overcome.
The build process ends with the final four screws fastening everything in place. We had no issues with the nylon screws and spacers included with the kit, but there have been reports of early batches of the Raspberry Pi M.2 HAT+ having some defects. Raspberry Pi quickly acknowledged the issue and announced changes to future batches. It’s good to see these implemented in the unit that made its way to us.
With our little AI-enabled Raspberry Pi sandwich now assembled, it was time to boot the system up and configure the software. This is a pretty straightforward process — after making sure your system is up to date, all you need to do is enable PCIe 3.0 speeds by adding a line to the /boot/firmware/config.txt file (this is important, as it significantly boosts performance — more on this later), install the hailo-all software package and clone a GitHub repo.
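In case you’d like the gist of it anyway, here’s roughly what the whole setup looks like (this mirrors the official guide at the time of writing, so package and repository names may change down the line):

sudo apt update && sudo apt full-upgrade -y        # bring the system up to date
sudo apt install -y hailo-all                      # Hailo kernel driver, firmware and tooling
echo "dtparam=pciex1_gen=3" | sudo tee -a /boot/firmware/config.txt   # enable PCIe 3.0 speeds
sudo reboot
# once rebooted, grab the camera app suite used by the demos
git clone --depth 1 https://github.com/raspberrypi/rpicam-apps.git ~/rpicam-apps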
To confirm that our Hailo-8L was being detected by the system, we ran the hailortcli command included in the software package we’d just installed. Success! Our Hailo-8L was ready to go, as evident from the command’s output.
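If you’d like to run the same sanity check yourself, the command in question ships as part of the hailo-all package:

hailortcli fw-control identify   # prints the board name, firmware version and serial number of any detected device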
At this point, we’d usually get into a little more detail when it comes to software setup, especially as some manufacturers struggle a little with providing quality documentation. However, Raspberry Pi’s docs are always top-notch, so we’ll just point you to the official guide in case you’re interested in the exact steps here.
We’ve already mentioned that most of the demos require a Raspberry Pi Camera, as they’re vision-based. This is something we embarrassingly overlooked right up until the point where we were supposed to actually launch one of the demos. And the camera was sitting in its box, right next to us the entire time! Whoops.
Installing the camera with the Raspberry Pi M.2 HAT+ in place proved to be a little tricky, so we’re using this chance to tell you not to make the same mistake we did: connect the camera before you put the HAT+ on your Raspberry Pi. While there is a cutout that lets the camera cables pass through nicely, the area around the camera connectors is cramped, especially with the Raspberry Pi Active Cooler in place. We ended up having to use a toothpick to reach under and operate the latching mechanism, finally managing to secure the camera’s ribbon cable after a little persuasion and fiddling around.
Remember that GitHub repo? It’s the repository for rpicam-apps, a handy suite of camera apps. Here you’ll find all of the post-processing JSON files necessary for getting started with the Raspberry Pi AI Kit.
Getting into too much detail about the inner workings of rpicam-apps’ post-processing system is a bit off-topic in the context of this review, but it’s worth covering some of the basics. The post-processing pipeline is based on stages, which roughly correspond to the general idea of a camera filter. A stage can be as simple as a basic color inversion filter but can also feature complex compute operations and perform real-time image analysis.
All stages are built right into rpicam-apps. It’s possible to write custom ones in C++ by deriving from the PostProcessingStage class. In order to use these custom stages, however, you have to recompile the whole thing.
Each stage comes with a matching post-processing JSON file that can be passed as a command-line argument when starting any of the suite’s included apps. These files contain the initialization parameters required by the stage itself and also let the app know which stage(s) to use. If desired, you can create your own post-processing JSON files and configure any app to use a number of stages simultaneously.
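To give you a feel for the format, here’s a minimal hand-rolled example using negate, one of the simplest built-in stages (the Hailo stages follow the same pattern, just with more parameters, such as which model file to load):

# write a post-processing file containing a single parameter-less stage
cat > ~/negate.json << 'EOF'
{
    "negate": {}
}
EOF
# then hand it to any of the rpicam apps
rpicam-hello -t 0 --post-process-file ~/negate.json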
We’re mentioning all this as it’s the system Raspberry Pi and Hailo have used to create the demos we’ll be testing today. There’s at least one new built-in post-processing stage, hailo_yolo_inference, which seems to be responsible for whirring up the Hailo-8L itself and loading up a model of choice.
As per Raspberry Pi’s official documentation, we’ll be running these using the rpicam-hello app as the frontend. It’s the simplest of the bunch, its only functionality being a live preview of whatever the camera is seeing.
Let’s run the YOLOv6 object detection model first. The command for that is:
rpicam-hello -t 0 --post-process-file ~/rpicam-apps/assets/hailo_yolov6_inference.json --lores-width 640 --lores-height 640
This demo is pretty simple: it puts boxes with labels around the items it detects and puts a percentage next to them, indicating how confident the model is in its predictions. By default, it runs at 30 FPS at a 640 x 640 resolution — and does so smoothly. Even more impressively, when we forced the demo to run at 60 FPS, there were no stutters or issues. The Hailo-8L kept up with the task at hand.
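For the curious, forcing the higher framerate is a single extra flag (assuming the camera’s sensor mode can keep up, of course):

rpicam-hello -t 0 --framerate 60 --post-process-file ~/rpicam-apps/assets/hailo_yolov6_inference.json --lores-width 640 --lores-height 640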
YOLOv6 was pretty decent at detecting things. It detected the keyboard sitting on our lab desk immediately, and although our iPad got labeled as a “laptop”, we can’t blame it — Apple’s Magic Keyboard really makes it look like one.
It also accurately detected our monitor (well, to be precise, it said it was a TV, but that’s the closest thing it was trained to detect).
There was a bit of funkiness once the model caught sight of our lab equipment. No, YOLO, that’s not a TV, and no, that’s absolutely not a microwave (yeah, we know it’s not been trained to detect these things, but it’s still funny). Come to think of it, perhaps soldering irons and bench power supplies are something an AI would rather pretend isn’t there. You might think it’s silly we’re mentioning this, but you’ll thank us once the AI apocalypse comes.
Don’t get your hopes up too much, though, as YOLOv6 is pretty good at detecting the tiniest amount of human in a given scene. Little bit of elbow showing? A few fingers peeking into the shot? Yeah, it’ll know you’re there. Even your reflections will give you away.
Okay, before we anger Googlebot too much by poking fun at its own kind, let’s move on to more demos. The same object detection demo is available in YOLOv8 and YOLOX flavors as well, as is a YOLOv5-based person and face detection model. These all provide a different balance of speed and accuracy.
The final two demos are YOLOv5-based image segmentation and YOLOv8-based full-body pose detection. Just like the previous demos, these run pretty well.
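For reference, every one of these demos is launched the same way as the YOLOv6 one, just with a different post-processing file; the asset names below match the official documentation at the time of writing:

rpicam-hello -t 0 --post-process-file ~/rpicam-apps/assets/hailo_yolov8_inference.json --lores-width 640 --lores-height 640     # YOLOv8 object detection
rpicam-hello -t 0 --post-process-file ~/rpicam-apps/assets/hailo_yolox_inference.json --lores-width 640 --lores-height 640      # YOLOX object detection
rpicam-hello -t 0 --post-process-file ~/rpicam-apps/assets/hailo_yolov5_personface.json --lores-width 640 --lores-height 640    # person and face detection
rpicam-hello -t 0 --post-process-file ~/rpicam-apps/assets/hailo_yolov5_segmentation.json --lores-width 640 --lores-height 640  # image segmentation
rpicam-hello -t 0 --post-process-file ~/rpicam-apps/assets/hailo_yolov8_pose.json --lores-width 640 --lores-height 640          # pose estimation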
The keen-eyed among you might have noticed a distinct lack of concrete performance numbers. The reason is pretty simple: we found no way of quantifying FPS figures or other performance metrics while running these demos, so you’ll have to take our word that performance feels pretty great.
We did, however, check resource usage in Raspberry Pi OS’ task manager, and were pleased to find that CPU load during all of these demos remained under ~20%. And even then, a good chunk of it came from the CPU having to render the live camera preview (and surprisingly, not so much from the Chromium browser we kept open in the background). Once we minimized the preview and enabled textual output from the model, usage dropped to ~10%. That’s almost an idle CPU, leaving plenty of headroom for any other software you might need to run. Impressive!
Further steps
Running those demos was nice! But what else can we do with the Raspberry Pi AI Kit?
For starters, there’s the very handy GitHub repository with a lot of software examples tailored to the Raspberry Pi 5. There’s a lot of stuff to explore here, and it’s all pretty neat, ranging from simple object detection demos to tutorials on creating and running retrained models or optimizing and compiling your own using Hailo’s Dataflow Compiler. There is definitely a noticeable learning curve here, so get ready to set some time aside if you want to get the most out of the Hailo-8L.
Also, keep in mind that for retraining, you’ll need a desktop with enough oomph (or should we say, with a sufficiently beefy GPU). Hailo used an RTX 4080 in their tutorials.
We especially liked the barcode and QR code detection demo that demonstrates just how much flexibility retrained models bring to the table. There’s also Hailo’s Model Zoo, a collection of ready-to-use models suitable for a wide range of applications.
Update (10/20/2024) – It seems some support for the Raspberry Pi AI Kit has since been added to the Picamera2 library as well. Check it out here.
However, Picamera2 support is still missing. As of writing, rpicam-apps is the only bit of Raspberry Pi’s software stack that’s deeply integrated with the Hailo accelerator. This means that it’s still a little tricky to programmatically access the module from your Python scripts. Hailo says that support for Picamera2 will be coming soon, and we hope that this opens up a whole new set of possibilities for utilizing the Hailo-8L through simple and streamlined Python calls.
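In the meantime, if you want to poke at the accelerator outside of the camera stack, HailoRT’s CLI can run a compiled model directly using dummy input frames and report the achieved throughput. A quick example (the model path here is just an illustration; point it at whichever HEF file you have on hand):

hailortcli run /usr/share/hailo-models/yolov6n.hef   # streams dummy data through the model and prints the resulting FPS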
Raspberry Pi AI Kit benchmarks
Leaving the performance assessment at “well, looks like it runs well” doesn’t seem quite right, so let’s actually try and get some quantifiable results. The benchmark we’ll be using today comes from Seeed Studio’s OSHW-RPi-Series GitHub repository, and measures YOLOv8 object detection and pose estimation performance on multiple platforms.
Let’s kick things off by running the benchmark on the Raspberry Pi 5’s CPU. Even though the BCM2712 is a good deal faster than its predecessors (and is a pretty snappy chip overall), inferencing and CPUs just aren’t a good match. The results illustrate this well: the YOLOv8 object detection model ran at 0.45 FPS, while the pose estimation model ran at 0.47 FPS (there’s a joke floating around our office about seconds per frame probably being a better metric in cases like this). Even worse, CPU utilization was pinned at 100% both times, with the active cooler working overtime to keep the temps in check. The whole system was drawing ~13.3 W. Granted, this was measured at the wall, so the ~90% efficiency of the switching-mode power supply should be taken into account. This means the Raspberry Pi itself was using around 12 W, though this also includes the mounted M.2 HAT+ and the idling Hailo-8L module. That shouldn’t make too much of a difference, but you can knock off around a watt if you really want to be safe with your estimations.
Running the same two tests on the Hailo-8L included with the Raspberry Pi AI Kit gave us very different results: 82.4 FPS and 66.1 FPS for object detection and pose estimation, respectively. Even more impressive is the power draw — just 9.7 W! Just like in the demos, CPU usage was low, oscillating between 15% and 30%. This is quite massive — for $70, the AI Kit gives you roughly 140–180× the inferencing performance and does so far more efficiently than the Raspberry Pi’s CPU.
Addendum: Hailo-8 on the Raspberry Pi 5 — and the story of PCIe connectivity
You might recall a mention of a “full” Hailo-8 accelerator way back, in the hardware overview section of the article. Unlike the Hailo-8L, which delivers 13 TOPS, the Hailo-8 delivers double: 26 TOPS. Both of these are based on the exact same architecture, and both run at 400 MHz (okay, well, some versions of the Hailo-8 run at 200 MHz, but that’s beside the point here), but the Hailo-8 features precisely twice as many compute elements as the Hailo-8L.
Well, we have a Hailo-8 module in the lab, and we wanted to see if it’d work with the Raspberry Pi AI Kit. At a quick glance, these two modules seem virtually identical, and the folks over at Raspberry Pi assured us our little experiment should work, but the software itself might spew a few errors here and there. That’s okay, we’ve dealt with errors before.
Since the Hailo-8 offers exactly twice the performance of the Hailo-8L, it would make sense that every benchmark number gets roughly doubled, right? Well — yes and no. While there is a lot more compute in the Hailo-8, there’s another side to the story. It’s got to do with memory and data transfer rates.
You might recall that NPUs usually don’t come with memory of their own, instead sharing system memory. As rapid memory access is one of the two big things AI tasks rely on (the other, naturally, being a sufficient amount of parallel compute), this is not ideal. It’s less of a problem for NPUs integrated into SoCs, as these can share the same high-speed memory channels used by the CPU, but it becomes much more visible with module-based setups, since all of their memory access has to pass through a bottleneck — the very interface they use to connect to the rest of the system.
In the case of the Raspberry Pi AI Kit (and the Raspberry Pi 5 in general), that’s a single PCIe lane, and while there’s a significant gain in performance when it’s running at PCIe 3.0 speeds as opposed to its native PCIe 2.0, neither configuration comes close to the speed of the Raspberry Pi 5’s internal memory bus, let alone the GDDR6 or HBM often found on GPUs. Worse yet, both the Hailo-8L and the Hailo-8 natively offer faster interfaces: both expose two PCIe 3.0 lanes, and the Hailo-8 is additionally available in an M.2 M Key configuration with four lanes (incidentally, that’s the version we have). Thus, depending on the exact module, you’ll only be utilizing somewhere between half and a quarter of its available bandwidth. Yes, how much memory bandwidth affects performance depends on the exact AI model you’re running, but the Raspberry Pi’s PCIe performance could be a bottleneck more often than not.
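To put some rough numbers on this (theoretical peaks; real-world throughput is lower): a single PCIe 2.0 lane tops out at 500 MB/s and a single PCIe 3.0 lane at roughly 1 GB/s, while the Raspberry Pi 5’s 32-bit LPDDR4X-4267 memory is good for around 17 GB/s. That’s over an order of magnitude of headroom the CPU enjoys but the accelerator never sees.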
To illustrate just how much of a difference memory bandwidth can make, let’s run the same YOLOv8 benchmarks from before, this time with the Raspberry Pi 5 configured to use PCIe 2.0 speeds. 500 MB/s just isn’t enough memory throughput for the model in question, and the very same Hailo-8L module that achieved some impressive results a second ago suddenly had its scores cut in half — delivering just 44.75 FPS on the object detection benchmark, and 34.16 FPS on the pose estimation one. It’s clear that memory performance is the bottleneck here, which made us wonder whether our initial benchmarks were also capped by the Raspberry Pi’s PCIe performance and not the Hailo-8L itself.
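Replicating this configuration is just a matter of reverting the earlier config.txt change; PCIe 2.0 is the Raspberry Pi 5’s default, so commenting the line out (or setting it explicitly, as below) and rebooting does the trick:

# in /boot/firmware/config.txt: back to PCIe 2.0 speeds
dtparam=pciex1_gen=2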
This all made us doubt a little whether the Hailo-8 would even make a difference on a Raspberry Pi 5, or at least whether we’d see anything remotely close to the performance doubling the specs themselves would suggest.
Without further ado, let’s get the Hailo-8 mounted in place. This module can be purchased as part of a $179 starter kit that includes the module itself, a thermal management kit and a screwdriver. With the provided heatsink in place, it was time to mount it (we could have reused the thermal solution we already had in place for the Hailo-8L, but decided against it so as to minimize the chances of any thermal throttling).
A quick reboot later, everything was ready. Another check with hailortcli confirmed that the software recognized the Hailo-8 module properly. We started the benchmark and…
…well, that’s disappointing — it spat out the same results as before with the Hailo-8L. Perhaps our hunch that PCIe was the bottleneck all along was right? Either way, the only thing left to do was to check the logs — and there it was: a warning that the HEF files used in the benchmark were compiled for the Hailo-8L while a Hailo-8 had been detected in the system, and that this might lead to limited performance.
What on earth is a HEF file, we hear you ask. It’s Hailo’s custom file format (HEF stands for Hailo Executable Format) which holds models optimized and compiled by their Dataflow Compiler. HEF files are ready to run on all of Hailo’s accelerators but have to be individually compiled (and optimized) for each one.
Our work was cut out for us — we had to recompile the provided models to utilize the full extent of the Hailo-8’s capabilities. This was a somewhat involved procedure (one we might cover in a follow-up article), but once done, it was just a matter of replacing the original HEF files that shipped with the benchmark with the ones we’d just made.
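We won’t reproduce the whole procedure here, but as a rough sketch: with Hailo’s AI software suite installed on a sufficiently powerful x86 machine, recompiling a Model Zoo network boils down to a single hailomz invocation (exact model names and flags depend on the tooling version, so treat this as an illustration rather than a recipe):

# run on a desktop with Hailo's Dataflow Compiler and Model Zoo installed, not on the Raspberry Pi
hailomz compile yolov8s --hw-arch hailo8   # optimizes and compiles the model, producing a yolov8s.hef ready for the Hailo-8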
Voila! The next run provided two new values: 126.9 FPS for the YOLOv8 object detection benchmark and 109.9 FPS for the pose estimation one. Not quite double but still solid.
It’s likely that we’re seeing the PCIe bottleneck in full effect here, but we’ll be the first to admit that it could also be due to our limited experience with the Dataflow Compiler. Take these numbers with a grain of salt (or two). After all, they aren’t meant to represent the performance difference between the Hailo-8L and Hailo-8 in ideal conditions, but instead to showcase the gains achievable on a Raspberry Pi 5 specifically — with us poking into the inner workings of the benchmark to even make it run.
Now, should you eye a Hailo-8 to upgrade the Hailo-8L that the Raspberry Pi AI Kit ships with? Maybe — 13 extra TOPS can go a long way, but it all depends on what you need that power for. At $179, you could get two entire extra Raspberry Pi AI Kits and still have $39 left to spare. You’ll still be limited by PCIe bandwidth on the Raspberry Pi 5 itself and it’s likely that systems with more robust PCIe connectivity options would benefit more from this upgraded module, but if your model just can’t reliably run on the lesser module and you like the Raspberry Pi/Hailo ecosystem, it might just make sense to upgrade.
Alternatives
The Raspberry Pi AI Kit is far from the only product offering AI at the edge. Pineboards offers a 90€ (that’s ~$98) AI Bundle which comes with the same Hailo-8L module (which lets it tap into all of the software made with the first-party AI Kit in mind) but packs a custom dual-M.2 board which enables using an NVMe drive and the Hailo module at the same time. By mounting this board underneath the Raspberry Pi 5 instead of going the HAT+ route, GPIO pins remain free.
However, there’s a big drawback — no matter how clever a bit of hardware is, you can’t just make PCIe lanes magically appear. The Pineboards AI Bundle uses a PCIe switch to split one PCIe 3.0 lane into two PCIe 2.0 lanes, one per M.2 slot. We’ve already covered the performance drop this causes, so pick with caution.
At this point, it’s not quite worth looking at Google Coral-based products anymore. They were as good as small edge accelerators got back in 2019, but their age is starting to show. Products like the Hailo-8L blow Coral TPUs out of the water, both when it comes to sheer performance and power efficiency.
If you’re interested in a very different approach to AI on the edge, NVIDIA’s Jetson lineup could prove to be a good pick. The Jetson Orin Nano could be particularly appealing, offering up to 40 TOPS in a 15 W package, but it’s significantly more expensive (and the switch to an entirely different software ecosystem can be daunting for some).
Finally, if you don’t need as much AI power, you could look into an RK3588-based board. We’ve covered quite a few of them (here, here and here, to name some), and while they’re mainly popular for their fast octa-core CPUs and the large amounts of RAM on offer, there’s also a 6 TOPS NPU embedded right in the SoC. Software support for it is still flaky, but we’ve seen usable projects get made.
Verdict
The Raspberry Pi AI Kit offers a whole lot of value for $70. We can’t overstate this fact — it takes a Raspberry Pi that can run but a few of the simplest AI models on its own and turns it into a mean little inferencing machine. A very power-efficient one at that.
Software support is also quite decent, although there’s a bit of a learning curve once you’re done playing with rpicam-apps. We expect that this will change as time goes on and as the Hailo-8L gets more and more tightly integrated into the Raspberry Pi OS itself — but for now, get ready to spend some extra time with the kit.
If you’re already invested in the Raspberry Pi ecosystem and you’re interested in tinkering with AI (or, you know, you’re developing an AI-powered edge device), the Raspberry Pi AI Kit simply makes a lot of sense and is a good way to ensure you’ll be getting the best possible performance out of your Hailo-8L module.
Just, please, don’t forget to add a longer stacking header to your shopping cart.