
To get started with visage|SDK, get in touch with us by filling out the contact form on our website. Your application will be carefully reviewed so that we can provide you with the most relevant information. You will then be contacted by your Account Manager, who will be happy to discuss your business needs further, share a link to our download center, and provide a free evaluation license if needed.

visage|SDK has the widest range of platform availability on the market, including mobile and embedded systems. Simply pick the package(s) for the platform(s) you are interested in, download them, and unpack them.

Your free evaluation license lets you try out our technology yourself. You are free to explore its functionalities and discover how they can complement your own business. Once you are ready to upgrade your license, let your Account Manager know; they will send you a quote and provide you with the appropriate licenses.

Demo applications

Visage|SDK contains a fully documented API and ready-to-build sample projects with full source code. This gives you a quick start in developing your applications using our technology.

Where possible, the sample applications are also provided as pre-built, ready-to-use demo applications. To start, select the platform section below for a list of all demo applications and instructions on how to test them.

After trying the demo applications, you might want to start developing your own. The documentation, available in Documentation.html in the top folder of each visage|SDK installation, provides a detailed reference for our API as well as further details on the sample projects described above.

Demo applications on Windows

Face and Facial Features Tracking

To test face and facial features tracking from camera in visage|SDK for Windows, please run the application DEMO_ShowcaseDemo.exe from the visageSDK/bin folder (visageSDK/bin64 on Windows 64-bit).
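For example, assuming visage|SDK was unpacked to C:\visageSDK (an illustrative location; adjust to your own install path), the demo can also be started from a Command Prompt:

rem C:\visageSDK is an illustrative install location; adjust to where you unpacked the package
cd C:\visageSDK\bin
DEMO_ShowcaseDemo.exe

On 64-bit Windows, use the bin64 folder instead (cd C:\visageSDK\bin64).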

In the menu on the right, you can choose to perform face tracking from a video file or in a still image, and choose different display options.

For full instructions, please refer to the documentation (visageSDK/Documentation.html), more specifically to section Samples → C# → Showcase Demo.

Facial Features Detection

To test facial features detection in visage|SDK for Windows, please run the application DEMO_ShowcaseDemo.exe from the visageSDK/bin folder (visageSDK/bin64/ on Windows 64-bit).

In the menu on the right, you can switch to the face detector; you will then be prompted to choose the image file to process.

For full instructions, please refer to the documentation (visageSDK/Documentation.html), more specifically to section Samples → C# → Showcase Demo.

Screen Space Gaze Tracking

To test screen space gaze tracking in visage|SDK for Windows, please run the application DEMO_GazeTracker.exe from the visageSDK/bin folder (visageSDK/bin64/DEMO_GazeTracker_x64.exe on Windows 64-bit). In the first (calibration) phase, you are required to repeatedly click on the red dots that are displayed. After that, the estimated gaze location is drawn as a blue dot on the screen.

For full instructions, please refer to the documentation (visageSDK/Documentation.html), more specifically to section Samples →  C++ → Gaze Tracker.

Face Analysis (age, gender and emotion estimation)

To test age, gender or emotion estimation in visage|SDK for Windows, please run the application DEMO_ShowcaseDemo.exe from the visageSDK/bin folder (visageSDK/bin64/ on Windows 64-bit).

In the menu on the right, you can enable age, gender and emotion estimation while tracking from the camera, or switch to tracking from a video file.

For full instructions, please refer to the documentation (visageSDK/Documentation.html), more specifically to section Samples → C# → Showcase Demo.

Demo applications on Android

Face and Facial Features Tracking

To test face and facial features tracking in visage|SDK for Android, please go to the visageSDK-Android/bin folder, copy ShowcaseDemo-release.apk to your Android device and install it. Once you open the installed application, tracking from the camera will start.
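If you have the Android platform tools installed and USB debugging enabled on your device, you can also install the package from a terminal instead of copying it manually, for example from the visageSDK-Android folder:

# assumes adb (Android platform tools) is on your PATH and USB debugging is enabled on the device
adb install bin/ShowcaseDemo-release.apk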

In the menu at the top, you can choose to perform face tracking in a still image or a video, and choose different display options.

For full instructions, please refer to the documentation (visageSDK-Android/Documentation.html), more specifically to section Samples → ShowcaseDemo.

Facial Features Detection

To test facial features detection in visage|SDK for Android, please go to the visageSDK-Android/bin folder, copy FaceDetectDemo-release.apk to your Android device and install it. Once you open the installed application, you will be prompted to choose an image on which to perform face detection.

For full instructions, please refer to the documentation (visageSDK-Android/Documentation.html), more specifically to section Samples →  Face Detect Demo.

Screen Space Gaze Tracking

Please note that at the moment there is no ready-to-run demo application for screen space gaze tracking on Android, although the functionality is available in the API (see the API → Facial Features Tracking → VisageGazeTracker section in the documentation). For a quick test of the gaze tracking feature in visage|SDK, you may want to try the Gaze Tracker sample application in visage|SDK for Windows, or simply use the online demo.

Face Analysis (age, gender and emotion estimation)

To test age, gender or emotion estimation in visage|SDK for Android, please go to the visageSDK-Android/bin folder, copy ShowcaseDemo-release.apk to your Android device and install it. Once you open the installed application, tracking from the camera will start.

In the menu at the top, you can enable age, gender and emotion estimation while tracking from the camera.

For full instructions, please refer to the documentation (visageSDK-Android/Documentation.html), more specifically to section Samples →  ShowcaseDemo.

Demo applications on iOS

Face and Facial Features Tracking

To test face and facial features tracking in visage|SDK for iOS, please try the ShowcaseDemo application, which is available on TestFlight via https://testflight.apple.com/join/dCJLerW4.

After running the application, tracking from the camera will start. In the menu at the top, you can choose to perform face tracking in a still image or a video, and choose different display options.

For full instructions on how to build and run the application, please refer to the documentation (visageSDK-iOS/Documentation.html), more specifically to section Samples → ShowcaseDemo.

Facial Features Detection

To test facial features detection in visage|SDK for iOS, you will need a valid license key file. For more information, please contact your sales representative.
Facial features detection is demonstrated in the FaceDetectDemo sample application. Information on how to include the license key file in the application can be found in the Licensing section of the documentation.

Alternatively, you can test facial features detection in the online demo. You will be required to allow the application to access your camera via the browser's pop-up in order to start tracking on camera frames. You can switch to the detector by clicking the “Switch to Detector” option in the upper right corner.

For full instructions, please refer to the documentation (visageSDK-iOS/Documentation.html), more specifically to section Samples → Face Detect Demo.

Screen Space Gaze Tracking

Please note that at the moment there is no ready-to-run demo application for screen space gaze tracking on iOS, although the functionality is available in the API (see the API → Class Reference → VisageGazeTracker section in the documentation). For a quick test of the gaze tracking feature in visage|SDK, you may want to try the Gaze Tracker sample application in visage|SDK for Windows, or simply use the online demo.

Face Analysis (age, gender and emotion estimation)

To test age, gender and emotion estimation in visage|SDK for iOS, please try the ShowcaseDemo application, which is available on TestFlight.

After running the application, tracking from the camera will start. In the menu at the top, you can enable age, gender and emotion estimation while tracking from the camera.

For full instructions on how to build and run the application, please refer to the documentation (visageSDK-iOS/Documentation.html), more specifically to section Samples → ShowcaseDemo.

Demo applications on HTML5

Face and Facial Features Tracking

To test face and facial features tracking in visage|SDK for HTML5, please try the online demonstration. You will be required to allow the application to access your camera via the browser's pop-up in order to start tracking. In the menu on the right, you can choose different display options.

For full instructions on how to run and test the sample application on a web server, please refer to the documentation (visageSDK-HTML5/Documentation.html), more specifically to section Samples →  Showcase Demo.
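If you want to test the samples locally, note that browsers generally only grant camera access to pages served over HTTP/HTTPS (or localhost), not to pages opened directly from disk. As a minimal sketch, assuming Python 3 is installed, you can serve the unpacked package with a simple local server and then open the Showcase Demo sample page in your browser (see the documentation for its exact location within the package):

cd visageSDK-HTML5
python3 -m http.server 8080
# then open http://localhost:8080/ in a browser and navigate to the sample page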

Facial Features Detection

To test face and facial features detection in visage|SDK for HTML5, please try the online demonstration. You will be required to allow the application to access your camera via the browser's pop-up in order to start tracking on camera frames. You can switch to the detector by clicking the “Switch to Detector” option in the upper right corner.

For full instructions on how to run the sample application on a web server, please refer to the documentation (visageSDK-HTML5/Documentation.html), more specifically to section Samples →  Showcase Demo.

Screen Space Gaze Tracking

To test gaze tracking in visage|SDK for HTML5, please try the online demonstration. You will be required to allow the application to access your camera via the browser's pop-up in order to start gaze tracking. In the first (calibration) phase, you are required to follow the red dot displayed on the screen. After that, the estimated gaze location is drawn as a blue dot on the screen.

For full instructions on how to run and test the sample application on a web server, please refer to the documentation (visageSDK-HTML5/Documentation.html), more specifically to section Samples → Gaze Tracker.

Face Analysis (age, gender and emotion estimation)

To test age, gender and emotion estimation in visage|SDK for HTML5, please try the online demonstration. You will be required to allow the application to access your camera via the browser's pop-up in order to start tracking and face analysis.

For full instructions on how to run and test the sample application on a web server, please refer to the documentation (visageSDK-HTML5/Documentation.html), more specifically to section Samples →  Showcase Demo.

Demo applications on MacOS

Face and Facial Features Tracking

To test face and facial features tracking in visage|SDK for Mac, please navigate to the visageSDK-MacOS/bin folder and open the application by double-clicking the VisageTrackerDemo.app file. Start tracking by selecting an option from the "Tracking" menu.
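If you prefer the terminal, the same application can be launched with the standard macOS open command:

cd visageSDK-MacOS/bin
open VisageTrackerDemo.app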

You can track from the camera, from video files or from still images by selecting the appropriate option in the "Tracking" menu. Sample videos are available in visageSDK-MacOS/data/video.

For full instructions, please refer to the documentation (visageSDK-MacOS/Documentation.html), more specifically to section Samples → VisageTrackerDemo.

Facial Features Detection

To test facial features detection in visage|SDK for Mac, please navigate to the visageSDK-MacOS/bin folder and open the application by double-clicking the Face Detector.app file. After running the application, go to File → Open and choose an image.

For full instructions, please refer to the documentation (visageSDK-MacOS/Documentation.html), more specifically to section Samples → Face Detector.

Screen Space Gaze Tracking

Please note that at the moment there is no ready-to-run demo application for screen space gaze tracking on Mac, although the functionality is available in the API. For a quick test of the gaze tracking feature in visage|SDK, you might want to try the Gaze Tracker sample application in visage|SDK for Windows, or simply use the online demo.

Face Analysis (age, gender and emotion estimation)

Please note that at the moment there is no ready-to-run, pre-built demo application for face analysis (age, gender and emotion estimation) on Mac.
The VisageTrackerUnity sample application demonstrates face analysis, but it needs to be built from the provided Unity package; instructions can be found in the documentation. Please note that you will also need a valid license key file, so please contact your sales representative.
For a quick test of the face analysis feature in visage|SDK, you may want to try the online demo.

Demo applications on RedHat

Face and Facial Features Tracking

To test face and facial features tracking in visage|SDK for RedHat, please navigate to the visageSDK-RedHat/bin folder. Start the application by running the ./runTracker.sh command from the terminal. You will be able to choose whether to track from the camera or from a video (a default video file will be used).
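For example, from the root of the unpacked package:

cd visageSDK-RedHat/bin
./runTracker.sh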

For full instructions, please refer to the documentation (visageSDK-RedHat/Documentation.html), more specifically to section Samples → VisageTrackerDemo.

Facial Features Detection

To test facial features detection in visage|SDK for RedHat, please navigate to the visageSDK-RedHat/bin folder. Start the application by running the ./runDetector.sh <filename> command from the terminal, with <filename> being the absolute or relative path to an image file. Sample images are available in Samples/Linux/data/images.
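For example, assuming the Samples folder sits next to bin in the package root, and using an illustrative file name (substitute any image from Samples/Linux/data/images):

cd visageSDK-RedHat/bin
# sample_image.jpg is an illustrative name; use any image from Samples/Linux/data/images
./runDetector.sh ../Samples/Linux/data/images/sample_image.jpg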

For full instructions, please refer to the documentation (visageSDK-RedHat/Documentation.html), more specifically to section Samples → VisageDetectorDemo.

Screen Space Gaze Tracking

Please note that at the moment there is no ready-to-run demo application for screen space gaze tracking on RedHat, although the functionality is available in the API (see API → Class Reference → VisageGazeTracker section in the documentation).

For a quick test of the gaze tracking feature in visage|SDK, you may want to try the Gaze Tracker sample application in visage|SDK for Windows, or simply use the online demo.

Face Analysis (age, gender and emotion estimation)

Please note that at the moment there is no ready-to-run demo application for face analysis (age, emotion and gender estimation) on RedHat, although the functionality is available in the API (see API → Facial Features Analysis → VisageFaceAnalyser section in the documentation).

For a quick test of the face analysis feature in visage|SDK, you may want to try the online demo. You will be required to allow the application to access your camera via the browser's pop-up in order to start tracking and face analysis.

Demo applications on Linux

Face and Facial Features Tracking

To test face and facial features tracking in visage|SDK for Linux, please navigate to the visageSDK-Linux/bin folder. Start the application by running the ./VisageTrackerDemo command from the terminal. You will be able to choose whether to track from the camera or from a video (a default video file will be used).

For full instructions, please refer to the documentation (visageSDK-Linux/Documentation.html), more specifically to section Samples → VisageTrackerDemo.

Facial Features Detection

To test facial features detection in visage|SDK for Linux, please navigate to the visageSDK-Linux/bin folder. Start the application by running the ./VisageDetectorDemo <filename> command from the terminal with <filename> being the absolute or relative path to an image file. Sample images are available in Samples/Linux/data/images.
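For example, from the terminal (the image path below is a placeholder; use any absolute or relative path to an image file, such as one of the samples in Samples/Linux/data/images):

cd visageSDK-Linux/bin
# /path/to/your_image.jpg is a placeholder; substitute the image you want to process
./VisageDetectorDemo /path/to/your_image.jpg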

For full instructions, please refer to the documentation (visageSDK-Linux/Documentation.html), more specifically to section Samples → Visage Detector Demo.

Screen Space Gaze Tracking

Please note that at the moment there is no ready-to-run demo application for screen space gaze tracking on Linux, although the functionality is available in the API (see API → Class Reference → VisageGazeTracker section in the documentation).

For a quick test of the gaze tracking feature in visage|SDK, you may want to try the Gaze Tracker sample application in visage|SDK for Windows, or simply use the online demo.

Face Analysis (age, gender and emotion estimation)

Please note that at the moment there is no ready-to-run demo application for face analysis (age, emotion and gender estimation) on Linux, although the functionality is available in the API (see API → Facial Features Analysis → VisageFaceAnalyser section in the documentation).

For a quick test of the face analysis feature in visage|SDK, you may want to try the online demo. You will be required to allow the application to access your camera via the browser's pop-up in order to start tracking and face analysis.

Demo applications in Unity 3D

There are two sample applications in visage|SDK that demonstrate the integration of visage|SDK with the Unity game engine. They are aimed at developers starting to use the face tracking, face analysis and face recognition functionalities of visage|SDK in Unity. Both sample applications are provided in visage|SDK as ready-to-build Unity projects in the Windows, Android, Mac and iOS packages; the HTML5 package provides only one Unity sample application.

visage|SDK for Windows

VisageTrackerUnity

The VisageTrackerUnityDemo sample application demonstrates how face tracking is used to put a virtual object (glasses) on the face. It also demonstrates age, gender and emotion estimation, and face recognition.

The application should be built from the provided Unity project (visageSDK\Samples\Unity\VisageTrackerUnityDemo). You will also need a valid license key file, so please contact your sales representative.

Alternatively, you can try out the online demo.

For full instructions, please refer to the documentation (visageSDK/Documentation.html), more specifically to section Samples →  Unity3D → Visage Tracker Unity.

FacialAnimationUnity

The FacialAnimationUnityDemo sample application demonstrates how face tracking is used to animate a virtual character and make it mimic the user's facial expressions.

The application should be built from the provided Unity project (visageSDK\Samples\Unity\FacialAnimationUnityDemo). You will also need a valid license key file, so please contact your sales representative.

For full instructions, please refer to the documentation (visageSDK/Documentation.html), more specifically to section Samples →  Unity3D → Facial Animation Unity.

visage|SDK for Android

VisageTrackerUnity

The VisageTrackerUnityDemo sample application demonstrates how face tracking is used to put a virtual object (glasses) on the face, as well as age, gender and emotion estimation, and face recognition.

The application should be built from the provided Unity project (visageSDK-Android\Samples\Android\VisageTrackerUnityDemo). You will also need a valid license key file, so please contact your sales representative.

Alternatively, you can try out the online demo.

For full instructions, please refer to the documentation (visageSDK-Android/Documentation.html), more specifically to section Samples →  Unity3D → Visage Tracker Unity Demo.

FacialAnimationUnity

The FacialAnimationUnityDemo sample application demonstrates how face tracking is used to animate a virtual character and make it mimic the user's facial expressions.

The application should be built from the provided Unity project (visageSDK-Android\Samples\Android\FacialAnimationUnityDemo). You will also need a valid license key file, so please contact your sales representative.

For full instructions, please refer to the documentation (visageSDK-Android/Documentation.html), more specifically to section Samples →  Unity3D → Facial Animation Unity Demo.

visage|SDK for iOS

VisageTrackerUnity

The VisageTrackerUnityDemo sample application demonstrates how face tracking is used to put a virtual object (glasses) on the face, as well as age, gender and emotion estimation, and face recognition.

Building and running the application involves two steps: generating and modifying the Xcode project from the provided Unity project (visageSDK-iOS/Samples/iOS/VisageTrackerUnityDemo), and then building the application in Xcode. You will also need a valid license key file, so please contact your sales representative.

Alternatively, you can try out the online demo.

For full instructions, please refer to the documentation (visageSDK-iOS/Documentation.html), more specifically to section Samples →  Unity3D → Visage Tracker Unity Demo.

FacialAnimationUnity

The FacialAnimationUnityDemo sample application demonstrates how face tracking is used to animate a virtual character and make it mimic the user's facial expressions.

Building and running the application involves two steps: generating and modifying the Xcode project from the provided Unity project (visageSDK-iOS/Samples/iOS/FacialAnimationUnityDemo), and then building the application in Xcode. You will also need a valid license key file, so please contact your Account Manager.

For full instructions, please refer to the documentation (visageSDK-iOS/Documentation.html), more specifically to section Samples →  Unity3D → Facial Animation Unity Demo.

visage|SDK for Mac OS

VisageTrackerUnity

The VisageTrackerUnityDemo sample application demonstrates how face tracking is used to put a virtual object (glasses) on the face, as well as age, gender and emotion estimation, and face recognition.

Building and running the application involves two steps: generating and modifying the Xcode project from the provided Unity project (visageSDK-MacOS/Samples/MacOSX/VisageTrackerUnityDemo), and then building the application in Xcode. Alternatively, you can preview the application in the Unity Editor. You will also need a valid license key file, so please contact your sales representative.

Alternatively, you can try out the online demo.

For full instructions, please refer to the documentation (visageSDK-MacOS/Documentation.html), more specifically to section Samples → Unity3D → Visage Tracker Unity Demo.

FacialAnimationUnity

The FacialAnimationUnityDemo sample application demonstrates how face tracking is used to animate a virtual character and make it mimic the user's facial expressions.

Building and running the application involves two steps: generating and modifying the Xcode project from the provided Unity project (visageSDK-MacOS/Samples/MacOSX/FacialAnimationUnityDemo), and then building the application in Xcode. Alternatively, you can preview the application in the Unity Editor. You will also need a valid license key file, so please contact your sales representative.

For full instructions, please refer to the documentation (visageSDK-MacOS/Documentation.html), more specifically to section Samples →  Unity3D → Facial Animation Unity Demo.

visage|SDK for HTML5

VisageTrackerUnity

The VisageTrackerUnityDemo sample application demonstrates how face tracking is used to put a virtual object (glasses) on the face, as well as age, gender and emotion estimation.

To try the VisageTrackerUnity sample application on HTML5, please try the online demo.

For full instructions on how to run and test the sample application on a web server, please refer to the documentation (visageSDK-HTML5/Documentation.html), more specifically to section Samples →  Unity3D →  VisageTrackerUnityDemo.
