I am working on two ECDIS projects in parallel. Both have to do with monitoring sea traffic.
As some details of the projects are confidential, I am not currently able to release any details on the projects themselves. But I can talk about my work in those two projects.
Project A:
The goal is to create a system that will help track ships that are involved in, or responsible for, different types of events that can happen at sea. For instance: an oil spill is located and reported somewhere, and the authorities need to identify the vessels suspected of causing the environmental damage. The project approaches the problem by creating a system that allows for sea traffic monitoring and then correlates that information with earth observation (EO) data to produce a list of suspect vessels. Vessels are ranked by certain qualitative, quantitative and spatial criteria to help the user make the final decision and identify the offending vessel, using the hints provided by the system along with his/her experience and best knowledge.
What I am building for this project is a custom ECDIS system that implements all the functionality required for this application.
Sea traffic is monitored using a network of AIS receivers. AIS messages are temporarily recorded in local databases and transmitted in real time to a central database through web services over a VPN.
The ECDIS software I am building uses AIS signals to mark the positions of different vessels at specific points in time. It also provides a mechanism for importing and recording vector data that come from EO sources (processed satellite imagery). EO data are used to identify event locations and, possibly, other targets detected in the area of the event during data capture.
The concept of the software is described below; I will post some screenshots or a screencast when possible to support the description:
The application window is divided vertically into two panes. The main pane (on the left) is the map pane where the ENC is displayed. The second narrower pane on the right is used for context sensitive information display. Both panes are tabbed for better information and functionality grouping. Application commands are available through main and context menus and toolbars.
Below the ENC pane there is a set of playback controls much like those you see in video players: a Play/Pause button, a timeline you can scroll, a playback speed selector etc.
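The playback mechanism can be sketched roughly as follows. This is a minimal illustration in Python (the actual application is written in C#), and all names here are invented for the example: a UI timer periodically "ticks" a clock that advances the focus time at a selectable multiple of real time.

```python
from datetime import datetime, timedelta

class PlaybackClock:
    """Advances the "focus time" at a selectable multiple of real time.
    Hypothetical sketch; the real application is a C# ECDIS console."""

    def __init__(self, start: datetime, speed: float = 1.0):
        self.focus_time = start   # the "current" instant shown on the ENC
        self.speed = speed        # playback speed multiplier (e.g. 60x)
        self.playing = False

    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def tick(self, real_seconds_elapsed: float):
        """Called by a UI timer; moves focus time forward while playing."""
        if self.playing:
            self.focus_time += timedelta(
                seconds=real_seconds_elapsed * self.speed)

# At 60x speed, one real second replays one minute of recorded traffic.
clock = PlaybackClock(datetime(2008, 5, 1, 12, 0), speed=60.0)
clock.play()
clock.tick(1.0)
```

Pausing simply stops the clock from advancing; scrolling the timeline would amount to assigning a new value to the focus time directly.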
All data recorded in the system (be they AIS messages, EO data or events) are tied to a specific point in time and space.
The concept is that while the system's database maintains historical sea traffic data for long periods of time, the user only needs to focus on a specific subset of those data related to a particular event. To enable this approach the application allows the user to select one of the recorded events to focus on. Focusing on an event implicitly means filtering AIS and EO data to a specific point in time and space; more accurately, around a specific point in time and space. This way the system loads only relevant data from the database, which makes processing faster and consumes fewer system resources. Selection of the time-frame and area is made either implicitly or explicitly by the user during the selection of the event under investigation.
So keep in mind two concepts here:
- the "investigated time-frame", which is essentially a period determined by a starting date/time and a length (duration)
- and the "investigated area", determined by the central location of the event and a range in nautical miles around it.
What is displayed at any given moment is then controlled by three more parameters:
- the "focus time"
- the length of the "visible time-frame"
- and the view-port (which is identified by its center coordinates and range)
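Deciding whether a recorded signal belongs to an investigation then boils down to two checks: does its timestamp fall inside the investigated time-frame, and does its position fall inside the investigated area? A minimal sketch in Python (the project itself is in C#; the function names and signatures are invented for illustration), using the haversine formula to measure great-circle distance in nautical miles:

```python
import math
from datetime import datetime, timedelta

NM_PER_KM = 1 / 1.852  # one nautical mile is exactly 1.852 km

def distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles (haversine formula)."""
    r_km = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a)) * NM_PER_KM

def in_investigation(msg_time, msg_lat, msg_lon,
                     start, duration, center_lat, center_lon, range_nm):
    """True if a record falls inside the investigated time-frame
    (start + duration) and the investigated area (center + range)."""
    in_time = start <= msg_time <= start + duration
    in_area = distance_nm(msg_lat, msg_lon,
                          center_lat, center_lon) <= range_nm
    return in_time and in_area
```

In practice such a filter would of course be pushed down into the database query rather than applied record by record in memory, but the criterion is the same.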
The length of the "visible time-frame" refers to the time span before focus time during which all recorded signals should be visualized. For instance, if the visible time-frame is set to 1 hour, then the track behind a vessel's "current" position will be displayed for the last hour ("current" being determined by the "focus time").
The "view-port" is nothing more than the visible area of the ENC and is defined by means of panning and zooming the map.
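Selecting the portion of a vessel's track to draw is then a simple window over its position reports: everything between (focus time − visible time-frame) and the focus time itself. A sketch in Python, with invented names (the application is in C#):

```python
from datetime import datetime, timedelta

def visible_track(position_reports, focus_time, visible_window):
    """Return the portion of a vessel's track to draw behind its
    "current" position: all reports inside the visible time-frame
    ending at focus time.

    position_reports -- list of (timestamp, lat, lon) tuples,
                        assumed sorted by time.
    """
    earliest = focus_time - visible_window
    return [p for p in position_reports
            if earliest <= p[0] <= focus_time]
```

With a 1-hour visible time-frame, a report from 90 minutes before focus time is dropped, and reports after focus time are not shown yet; as the playback clock advances, the window slides forward with it.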
You are probably already getting the "big picture": a system that will play back what happened around an event (e.g. an oil spill) and allow you to watch it like you would watch a video. What you see is what you would have seen flying over the event at the selected time, only mapped onto the ENC, loaded with useful information and, of course, interactive.
During playback, you can move around the map by zooming and panning, change the playback speed and generally interact fully with the application (all functionality remains available).
By pointing your mouse at a vessel's latest signal or track you can see relevant information in the right pane of the application. Information available for each vessel includes all fields of AIS message type 5 (ship static and voyage information), all fields of AIS message types 1, 2 and 3 (position reports), and derived information, based on algorithms and ranking databases, that helps classify the ship and rank the probability of it being the offending vessel.
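To make the two kinds of AIS information concrete, the sketch below models the fields the information pane draws on. The field lists follow the AIS message type definitions (types 1–3 carry dynamic position reports, type 5 carries static and voyage data); the class and function names are invented for illustration in Python, not taken from the actual C# code, and the derived ranking information is deliberately left out since those criteria are the project's own.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class PositionReport:
    """Dynamic fields carried by AIS message types 1, 2 and 3."""
    mmsi: int                    # vessel identifier shared by all messages
    timestamp: datetime
    latitude: float
    longitude: float
    speed_over_ground: float     # knots
    course_over_ground: float    # degrees
    true_heading: Optional[int]  # degrees; may be unavailable
    navigational_status: int     # e.g. 0 = under way using engine

@dataclass
class StaticVoyageData:
    """Static and voyage fields carried by AIS message type 5."""
    mmsi: int
    imo_number: int
    call_sign: str
    name: str
    ship_type: int
    destination: str
    eta: Optional[datetime]
    draught: float               # metres

def vessel_info(mmsi, static_by_mmsi, latest_report_by_mmsi):
    """Join a vessel's static data with its latest position report,
    roughly what the right-hand pane displays on mouse-over."""
    return static_by_mmsi.get(mmsi), latest_report_by_mmsi.get(mmsi)
```

The MMSI is the key that ties the two message families together, which is why both record types carry it.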
At the time of writing this project is in its final stages. It has been demonstrated to the customer to their satisfaction, and is now pending some further development and optimization before it is officially presented and delivered.
Project B:
Project B is an entirely independent project from Project A. Nevertheless, it is so relevant in context that it is being developed in parallel; so far I have not even seen a need to branch the first project's code base, as the minor behavioral differences are handled very effectively through configuration files. The goal of this project is to use AIS, VTS radar and EO data to identify certain types of vessels in the context of naval security.
AIS, VTS and EO data are correlated with data fusion algorithms and the results are again ranked by risk level.
My work in this project involves the creation of the visualization console. Data acquisition and fusion are handled by other project parties, and the results of their work are simply input data for my application.
The main difference from Project A is that this time, the software must be used mainly for near real-time monitoring. Playback is just a useful feature.
This project uses more sophisticated data sets and also includes estimated data. There are also considerable differences in data formats. All of these were handled properly during the design phase of the software, and provisions were made so that it can read a wider variety of data sources.
This project is also approaching its demonstration phase.
Technologies used in both projects include:
- Microsoft SQL Server 2005
- PostgreSQL (only in Project B)
- Microsoft .NET Framework 2.0
- SevenCs EC2007 ECDIS SDK
Both systems are being developed using Microsoft Visual Studio 2005 and C#.