Image segmentation and recommender system Jupyter notebooks are now available in the NGC catalog. These Jupyter notebooks come with complete instructions on how to train these models using the resources from the NGC catalog.
Upcoming Webinars
The NVIDIA NGC team is hosting two webinars with live Q&A to dive into two new Jupyter notebooks available from the NGC catalog. Learn how to use these resources to kickstart your AI journey.
NVIDIA NGC Jupyter Notebook Day: Image Segmentation
February 18 at 9 a.m. PT
Image segmentation assigns each pixel of an image to a specific class of pixels that share common characteristics.
In this session, you’ll learn:
- How to use a Jupyter notebook containing a pre-trained image segmentation model that can be used to detect defective parts in an industrial application
- How to refine the model by retraining it with your own hyperparameters and test it using your own checkpoints
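To make the per-pixel idea concrete, here is a minimal, hypothetical PyTorch sketch of segmentation inference; the single convolution is only a stand-in for the notebook's actual pre-trained network, and the shapes are illustrative:

```python
# A minimal sketch of segmentation inference: per-pixel class scores
# reduced to a label map. The model here is a placeholder, not the
# NGC notebook's pre-trained network.
import torch
import torch.nn as nn

num_classes = 2  # e.g., "defect" vs. "background" in an industrial setting
model = nn.Conv2d(3, num_classes, kernel_size=3, padding=1)  # stand-in network

image = torch.rand(1, 3, 512, 512)  # one RGB frame (batch, channels, H, W)
with torch.no_grad():
    logits = model(image)            # (1, num_classes, H, W) scores per pixel
    mask = logits.argmax(dim=1)      # (1, H, W): the winning class per pixel
print(mask.shape)
```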
NVIDIA NGC Jupyter Notebook Day: Recommender System
February 18 at 11 a.m. PT
Recommender systems deal with predicting user preferences for products based on historical behavior or actions and are widely used in online retail, social media, streaming video, music platforms, and more.
In this session, you’ll learn:
- How to leverage a Jupyter notebook containing a pre-trained recommender system model that can be used to recommend a movie based on a user’s viewing history
- How to refine the model by retraining it with your own hyperparameters and test it using your own checkpoints
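To make the idea concrete, here is a toy matrix-factorization sketch that scores movies for a user from learned embeddings; it is purely illustrative and is not the pre-trained model covered in the notebook:

```python
# A toy matrix-factorization recommender: a movie's score for a user is
# the dot product of their embedding vectors. Random factors stand in
# for learned ones.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_movies, dim = 100, 500, 16
user_factors = rng.normal(size=(n_users, dim))
movie_factors = rng.normal(size=(n_movies, dim))

def recommend(user_id, k=5, seen=()):
    scores = movie_factors @ user_factors[user_id]
    scores[list(seen)] = -np.inf           # never re-recommend watched movies
    return np.argsort(scores)[::-1][:k]    # top-k movie ids by score

print(recommend(user_id=7, seen={3, 42}))
```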
Constrain outputs in a regression problem
Hi, everyone.
I am attempting to constrain some outputs of my regression network, say x, y, z = model(data), where x, y, z are scalars. The constraint I want to impose is that when predicting all three dependent variables, the condition “x + y <= 1.0” must be honored. Given this description, can I implement this in a forward function?
Thank you!
submitted by /u/ncuxomun
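One common way to honor such a constraint inside forward (a sketch, not from the thread; the network body and layer sizes are hypothetical) is to project the offending pair back onto the feasible half-plane x + y <= 1:

```python
# A minimal sketch: project (x, y) onto the half-plane x + y <= 1
# inside forward. The projection is differentiable almost everywhere,
# so gradients still flow during training.
import torch
import torch.nn as nn

class ConstrainedRegressor(nn.Module):
    def __init__(self, in_features, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, data):
        x, y, z = self.body(data).unbind(dim=-1)
        excess = torch.clamp(x + y - 1.0, min=0.0)  # 0 when already feasible
        x = x - excess / 2                           # split the violation
        y = y - excess / 2                           # equally between x and y
        return x, y, z

model = ConstrainedRegressor(in_features=8)
x, y, z = model(torch.rand(4, 8))
assert torch.all(x + y <= 1.0 + 1e-6)
```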
Tool for Complex Data Labelling Tasks
Hi /r/tensorflow readers!
We have created a labelling tool that can be customized to display all sorts of data models and tasks. Here are a couple of examples for NLP and CV.
I hope some of you will find this useful, and if you have any thoughts I would love to hear your feedback!
submitted by /u/bernatfp
Thirteen-year-old Adrit Rao was awarded the Jetson Project of the Month for his Blink Detection and Reminder (Blinkr). The project, which runs on an NVIDIA Jetson Nano 2GB Developer Kit, monitors the user’s eyes and voices a prompt when their blink rate falls below the recommended 10 blinks per minute.
Several studies have shown that a low eye-blink rate, usually triggered by prolonged computer screen use, is a leading cause of computer vision syndrome and related disorders. To address this problem, Adrit created Blinkr with a simple setup: a Jetson Nano 2GB Developer Kit, a webcam (or a Raspberry Pi v2 camera), a speaker, and a few other basic peripherals.
The camera monitors the user’s face and feeds the frames to the Jetson Nano. To detect blinking, Adrit uses a pre-trained 68-point facial landmark model from the open source Dlib library. Eyes are located in each frame, and the eye aspect ratio (EAR) is calculated and used to record the number of blinks over time. When the total number of blinks in a minute falls below the recommended rate, the speaker voices an alarm urging the user to blink more.
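For reference, the EAR check typically looks something like the following minimal sketch built on Dlib's 68-point model (this is not the project's exact code, and the blink threshold is illustrative):

```python
# A minimal sketch of an EAR-based blink check with Dlib's 68-point
# facial landmarks. Assumes the standard predictor file is on disk.
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(pts):
    # pts: six (x, y) landmarks around one eye, ordered p1..p6.
    a = np.linalg.norm(pts[1] - pts[5])   # vertical distance p2-p6
    b = np.linalg.norm(pts[2] - pts[4])   # vertical distance p3-p5
    c = np.linalg.norm(pts[0] - pts[3])   # horizontal distance p1-p4
    return (a + b) / (2.0 * c)

def blink_in_frame(gray, ear_threshold=0.2):
    # Returns True if either eye's EAR drops below the (illustrative)
    # threshold, i.e., the eye is closed in this frame.
    for face in detector(gray):
        shape = predictor(gray, face)
        coords = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)])
        right, left = coords[36:42], coords[42:48]  # standard eye indices
        if min(eye_aspect_ratio(right), eye_aspect_ratio(left)) < ear_threshold:
            return True
    return False
```

Counting runs of consecutive low-EAR frames, rather than single frames, is the usual way to turn this per-frame check into a blink count over time.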
Many of us working from home do not have the usual prompts or interruptions during our day to move away from our screens. Tools like Blinkr can help us adopt healthy screen habits. This is a great project to build at home to learn about Jetson and AI, and to protect your eyesight.
This project earned Adrit his Jetson AI Specialist certificate. We are keeping our appreciative (and healthy) eyes peeled to see what he builds next. If you’re interested in building your own Blinkr, he has shared the instructions and the code here.
NVIDIA Omniverse is setting a new standard in real-time graphics for developers. Teams across industries are now using the open, cloud-native platform to deliver new levels of virtual collaboration and photorealistic simulation to their projects. And with open beta availability recently announced, more developers around the world can experience Omniverse and explore ways to integrate technologies or connect applications.
Check out some of the resources on the NVIDIA On-Demand catalog to learn more tips and tricks for developing in Omniverse:
Getting Started with Omniverse Launcher: This session covers installing and configuring the Omniverse Launcher, with an overview of how to install applications and connectors.
Omniverse Create Overview: Learn how Omniverse Create accelerates advanced scene composition and allows users to assemble, light, simulate, and render complex USD scenes in real time.
Omniverse View Overview: This session is an introduction to Omniverse View, an application created specifically for architecture, engineering, and design professionals.
What Makes USD Unique: USD is the backbone of the Omniverse collaboration technology. This video discusses Pixar’s USD file format, explains the basics of its structure, and introduces layers, references, and sublayers (illustrated in the sketch after this list).
Omniverse Five Things to Know About Materials: This talk shows users where to find materials and how to interact with them in Omniverse Create, how to create and import your own MDL materials, and how to convert existing materials for use in Omniverse.
Intro to Omniverse Unreal Engine 4 Connector: Get a brief introduction to the Omniverse Unreal Engine 4 (UE4) Connector, which consists of two plugins: a USD plugin and an MDL plugin. This connector lets creators live-link Omniverse applications (like View and Create) with UE4.
Deep Dive into Omniverse Kit: Get an introduction to Omniverse Kit and learn how developers can leverage this powerful toolkit to create new Omniverse Apps and extensions.
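As a taste of what the USD session covers, here is a minimal sketch of layers, sublayers, and references using Pixar's USD Python API; the file names are illustrative:

```python
# A minimal sketch of USD composition: a stage with a prim hierarchy,
# a sublayer, and a reference to an external asset. File names are
# made up for illustration.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("scene.usda")
UsdGeom.Xform.Define(stage, "/World")
UsdGeom.Sphere.Define(stage, "/World/Ball")

# Sublayers stack opinions: stronger layers override weaker ones.
stage.GetRootLayer().subLayerPaths.append("lighting.usda")

# References graft an external asset into this stage's namespace.
chair = stage.OverridePrim("/World/Props/Chair")
chair.GetReferences().AddReference("chair_asset.usda")

stage.GetRootLayer().Save()
```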
Download Omniverse today and check out other Omniverse sessions on the NVIDIA On-Demand portal.
Designers, engineers, researchers, and creative professionals all need the flexibility to run complex workflows, no matter where they’re working from. With the newest release of NVIDIA virtual GPU (vGPU) technology, enterprises can provide their employees with more power and flexibility through GPU-accelerated virtual machines from the data center or cloud. Available now, the latest version… Read article >
The post NVIDIA Expands vGPU Software to Accelerate Workstations, AI Compute Workloads appeared first on The Official NVIDIA Blog.
When it comes to autonomous vehicle sensor innovation, it’s best to keep an open mind — and an open development platform. That’s why NVIDIA DRIVE is the chosen platform on which the majority of these sensors run. In addition to camera sensors, NVIDIA has long recognized that lidar is a crucial component of an autonomous… Read article >
The post A Sense of Responsibility: Lidar Sensor Makers Build on NVIDIA DRIVE appeared first on The Official NVIDIA Blog.
NVIDIA Announces Nsight Graphics 2021.1
Nsight Graphics 2021.1 is available to download – check out this article to see what’s new.
We now provide you with the ability to set any key to be the capture shortcut. This new keybinding is supported for all activities, including GPU Trace. F11 is the default binding for both capture and trace, but if you prefer the old behavior, the original capture keybinding is still supported (when the ‘Frame Capture (Target) > Legacy Capture Chord’ setting is set to Yes).
You can now profile applications that use D3D12 or Vulkan strictly for compute tasks using the new ‘One-shot’ option in GPU Trace. Tools that generate normal maps or use DirectML for image upscaling can now be properly profiled and optimized. To enable this, set the ‘Capture Type’ to ‘One-shot [Beta]’.
While TraceRays/DispatchRays has been the common way to initiate ray generation, it’s now possible to ray trace directly from your compute shaders using DXR1.1 and the new Khronos Vulkan Ray Tracing extension. In order to support this new approach, we’ve added links to the acceleration structure data for applications that use RayQuery calls in compute shaders.
It’s important to know how much GPU memory you’re using in ray tracing applications and to keep it as low as possible. We’re now making this even easier by adding size information to the Acceleration Structure Viewer.
Finally, we’ve added the Nsight HUD to Windows Vulkan applications in all frame debugging capture states. Previously the HUD was only activated once an application was captured.
We’re always looking to improve the HUD, so please send us any feedback you might have.
For more details on Nsight Graphics 2021.1, check out the release notes (link).
We want to hear from you! Please continue to use the integrated feedback button that lets you send comments, feature requests, and bugs directly to us with the click of a button. You can send feedback anonymously or provide an email so we can follow up with you about your feedback. Just click on the little speech bubble at the top right of the window.
Try out the latest version of Nsight Graphics today!
Khronos released the final Vulkan Ray Tracing extensions today, and NVIDIA Vulkan beta drivers are available for download. Welcome to the era of portable, cross-vendor, cross-platform ray tracing acceleration!
And be sure to check out the final Vulkan Ray Tracing extensions from the Khronos Group as well!
AI, the most powerful technology of our time, demands a new generation of computers tuned and tested to drive it forward. Starting today, data centers can boot up a new class of accelerated servers from our partners to power their journey into AI and data analytics. Top system makers are delivering the first wave… Read article >
The post Certifiably Fast: Top OEMs Debut World’s First NVIDIA-Certified Systems Built to Crush AI Workloads appeared first on The Official NVIDIA Blog.