
Vision-Based System Design Part 4 – Building an Embedded Vision System


Handing Over to Software Development

Once the design has been verified and addresses have been assigned to the memory-mapped peripherals on the high-performance and general-purpose AXI interconnects (this can be done automatically), the hardware is built in Vivado 2015.4 and exported to the SDK 2015.4 software development environment.

Before the system can be used, the SDK is used to configure not only the design within the Zynq 7020 but also several elements on the EVK. This is because the PS acts as the system supervisor and control function in this example, so the entire embedded vision system has to be configured from it.

Some simple software is needed to get the system up and running. In particular, it configures the key system elements: the Python 1300C camera via its SPI interface, the AXI Python 1300C interface module, the AXI VDMA that reads and writes frames in DDR memory, the AXI Colour Filter Array, the HDMI output device, the I2C mux and its associated peripherals, and the I2C IO expander controlling the power rails for the Python 1300C.

The EVK uses the Zynq 7020 PS I2C controller to configure the HDMI output device and to enable the power supplies to the Python device. Avnet supplies APIs for controlling the I2C bus and for configuring the following devices (a short sketch of driving the PS I2C controller follows the list):

• ADV7511 – API for the HDMI output

• CAT9554 – API for the I2C I/O expander on the camera module

• TCA9548 – API for the I2C mux on the EVCC

• PCA9534 – API for the I2C IO expander on the EVCC

• OnSemi_Python_SW – API for the Python 1300C

• XAXIVDMA_EXT – API for configuring the VDMA

• XIICPS_EXT – API for driving the external I2C 
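As an illustration, the sketch below shows one way the PS I2C controller could be driven with the standard Xilinx XIicPs driver, on top of which the Avnet I2C APIs are layered. The device addresses and bus clock rate shown here are assumptions and must be checked against the EVCC schematic.

#include "xparameters.h"
#include "xiicps.h"
#include "xstatus.h"

#define IIC_DEVICE_ID   XPAR_XIICPS_0_DEVICE_ID  /* PS I2C controller */
#define IIC_SCLK_RATE   100000                   /* assumed 100 kHz bus */
/* Assumed 7-bit addresses - verify against the EVCC schematic */
#define TCA9548_ADDR    0x70                     /* I2C mux */
#define ADV7511_ADDR    0x39                     /* HDMI transmitter */

static XIicPs Iic;

/* Initialise the PS I2C controller that reaches the HDMI transmitter,
   the I2C mux and the IO expanders on the EVCC and camera module. */
int init_ps_iic(void)
{
    XIicPs_Config *cfg;

    cfg = XIicPs_LookupConfig(IIC_DEVICE_ID);
    if (cfg == NULL)
        return XST_FAILURE;

    if (XIicPs_CfgInitialize(&Iic, cfg, cfg->BaseAddress) != XST_SUCCESS)
        return XST_FAILURE;

    return XIicPs_SetSClk(&Iic, IIC_SCLK_RATE);
}

/* Write a single device register (register address followed by value). */
int iic_write_reg(u16 dev_addr, u8 reg, u8 val)
{
    u8 buf[2];

    buf[0] = reg;
    buf[1] = val;

    if (XIicPs_MasterSendPolled(&Iic, buf, 2, dev_addr) != XST_SUCCESS)
        return XST_FAILURE;

    while (XIicPs_BusIsBusy(&Iic))
        ;                            /* wait for the stop condition */

    return XST_SUCCESS;
}

The Avnet ADV7511, TCA9548, CAT9554 and PCA9534 APIs wrap accesses of this kind together with the device-specific register maps.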

In addition to these APIs, Xilinx APIs for the IP within the image-processing chain are also provided.
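The AXI VDMA, for instance, has a standard Xilinx driver (XAxiVdma). The sketch below configures the write (S2MM) channel so that incoming camera frames are stored in DDR; the read (MM2S) channel towards the HDMI output is set up in the same way using XAXIVDMA_READ. The frame geometry, bytes per pixel, number of frame stores and buffer base address are assumptions that must match the actual block design and the Python 1300C output format.

#include <string.h>
#include "xparameters.h"
#include "xaxivdma.h"
#include "xstatus.h"

#define VDMA_DEVICE_ID    XPAR_AXIVDMA_0_DEVICE_ID /* name depends on the block design */
#define FRAME_WIDTH       1280                     /* Python 1300C active pixels */
#define FRAME_HEIGHT      1024
#define BYTES_PER_PIXEL   2                        /* assumed pixel format in DDR */
#define NUM_FRAME_STORES  3                        /* assumed frame stores in the VDMA IP */
#define FRAME_BUFFER_BASE 0x10000000               /* assumed free region of DDR */

static XAxiVdma Vdma;

/* Configure and start the VDMA write channel (camera to DDR). */
int init_vdma_write(void)
{
    XAxiVdma_Config *cfg;
    XAxiVdma_DmaSetup setup;
    u32 addr[NUM_FRAME_STORES];
    int i;

    cfg = XAxiVdma_LookupConfig(VDMA_DEVICE_ID);
    if (cfg == NULL)
        return XST_FAILURE;

    if (XAxiVdma_CfgInitialize(&Vdma, cfg, cfg->BaseAddress) != XST_SUCCESS)
        return XST_FAILURE;

    memset(&setup, 0, sizeof(setup));
    setup.VertSizeInput     = FRAME_HEIGHT;
    setup.HoriSizeInput     = FRAME_WIDTH * BYTES_PER_PIXEL;  /* line length in bytes */
    setup.Stride            = FRAME_WIDTH * BYTES_PER_PIXEL;
    setup.FrameDelay        = 0;
    setup.EnableCircularBuf = 1;     /* cycle continuously through the frame stores */

    if (XAxiVdma_DmaConfig(&Vdma, XAXIVDMA_WRITE, &setup) != XST_SUCCESS)
        return XST_FAILURE;

    for (i = 0; i < NUM_FRAME_STORES; i++)
        addr[i] = FRAME_BUFFER_BASE +
                  i * FRAME_WIDTH * FRAME_HEIGHT * BYTES_PER_PIXEL;

    if (XAxiVdma_DmaSetBufferAddr(&Vdma, XAXIVDMA_WRITE, addr) != XST_SUCCESS)
        return XST_FAILURE;

    return XAxiVdma_DmaStart(&Vdma, XAXIVDMA_WRITE);
}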

To create the software application, the hardware design is imported into SDK and a board support package (BSP) is created for this hardware. Coupled with the Avnet APIs, the BSP contains all of the Xilinx APIs needed to drive the Zynq 7020 and the hardware in the image-processing chain.

The software itself is required to initialise the AXI peripherals, power up the image-sensor rails, and configure the Python 1300C, the colour filter array and the VDMA. The main steps, sketched after the list below, are:

• Initialise all of the AXI peripherals

• Power up the image-sensor rails

• Configure the Python 1300C, the colour filter array and the VDMA
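A minimal sketch of this bring-up sequence is shown below. The helper routines are hypothetical placeholders standing in for the Avnet and Xilinx driver calls listed earlier (init_ps_iic() and init_vdma_write() correspond to the sketches above); the actual register settings belong in the Avnet reference code.

#include "xstatus.h"

/* Hypothetical wrappers around the Avnet and Xilinx APIs listed earlier. */
int init_ps_iic(void);          /* PS I2C controller (sketched earlier) */
int init_vdma_write(void);      /* VDMA write channel (sketched earlier) */
int enable_camera_rails(void);  /* CAT9554/PCA9534: switch on the Python power rails */
int config_adv7511(void);       /* ADV7511: HDMI output format and timing */
int config_python_1300c(void);  /* OnSemi_Python_SW: sensor registers over SPI */
int config_cfa(void);           /* AXI Colour Filter Array interpolation */

int main(void)
{
    /* 1. Initialise the AXI peripherals and the PS I2C controller */
    if (init_ps_iic() != XST_SUCCESS)
        return XST_FAILURE;

    /* 2. Power up the image-sensor rails via the I2C IO expanders */
    if (enable_camera_rails() != XST_SUCCESS)
        return XST_FAILURE;

    /* 3. Configure the sensor, the colour filter array, the HDMI
          output and the VDMA, then let the pipeline stream */
    if (config_python_1300c() != XST_SUCCESS ||
        config_cfa()          != XST_SUCCESS ||
        config_adv7511()      != XST_SUCCESS ||
        init_vdma_write()     != XST_SUCCESS)
        return XST_FAILURE;

    /* Frames now flow from the sensor through DDR to the HDMI monitor */
    while (1)
        ;

    return 0;
}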

When the software is run on the EVK, completing these steps results in an image being output on the HDMI monitor, as shown in figure 2.

Conclusion 

Following the earlier articles in this series, which discussed the elements of an embedded vision system, this demonstration shows how such a system is built using software APIs and IP libraries. From this starting point, custom algorithms can be developed to add further value to the image-processing chain.

Further information can be found on the MicroZed Chronicles blog and in the Avnet GitHub repository.

For more information, please visit: http://www.xilinx.com/products/design-tools/embedded-vision-zone.html
