## PyTorch Tensor to NumPy Array and Back

2021-03-22

You can easily convert a NumPy array to a PyTorch tensor and a PyTorch tensor to a NumPy array. This post explains how it works.
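The round trip is a one-liner in each direction. A minimal sketch (the example array is my own):

```python
import numpy as np
import torch

# NumPy array -> PyTorch tensor. torch.from_numpy() shares memory
# with the source array rather than copying it.
a = np.array([1.0, 2.0, 3.0])
t = torch.from_numpy(a)

# PyTorch tensor -> NumPy array. For CPU tensors, .numpy() also
# shares memory with the tensor.
b = t.numpy()

# Because the memory is shared, an in-place edit shows up everywhere.
t[0] = 10.0
print(a)  # [10.  2.  3.]
print(b)  # [10.  2.  3.]
```

If you want a copy instead of a view, `torch.tensor(a)` copies the data.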

## Posts

- PyTorch Tensor to NumPy Array and Back (2021-03-22)
- TorchVision Transforms: Image Preprocessing in PyTorch (2021-03-19)
- PyTorch One Hot Encoding (2021-02-02)
- The PyTorch Softmax Function (2021-01-29)
- Normalizing Images in PyTorch (2021-01-15)
- Object Tracking in 75 Lines of Code (2020-08-01)
- Cross Entropy Loss in PyTorch (posted 2020-07-24, updated 2021-01-28)
- Adding a Dimension to a Tensor in PyTorch (posted 2017-03-09, updated 2020-01-02)
- PyTorch Quick Start: Classifying an Image (posted 2017-02-25, updated 2019-02-28)


## TorchVision Transforms: Image Preprocessing in PyTorch

2021-03-19

TorchVision, a PyTorch computer vision package, has a great API for image preprocessing in its torchvision.transforms module. This post gives some basic usage examples, describes the API, and shows you how to create and use custom image transforms.

## PyTorch One Hot Encoding

2021-02-02

PyTorch has a one_hot() function for converting class indices to one-hot encoded targets.
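The function lives in torch.nn.functional; a quick sketch:

```python
import torch
import torch.nn.functional as F

# Class indices -> one-hot encoded rows
labels = torch.tensor([0, 2, 1])
targets = F.one_hot(labels, num_classes=3)
print(targets)
# tensor([[1, 0, 0],
#         [0, 0, 1],
#         [0, 1, 0]])
```

Note that the output dtype is int64; cast with `.float()` if a loss function expects floating-point targets.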

## The PyTorch Softmax Function

2021-01-29

You can use the top-level torch.softmax() function from PyTorch for your softmax activation needs.
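A small sketch with made-up logits:

```python
import torch

logits = torch.tensor([1.0, 2.0, 3.0])

# dim tells softmax which axis should sum to 1; it is required
probs = torch.softmax(logits, dim=0)
print(probs)        # tensor([0.0900, 0.2447, 0.6652])
print(probs.sum())  # sums to (approximately) 1
```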

## Normalizing Images in PyTorch

2021-01-15

You can use the torchvision Normalize() transform to subtract the mean and divide by the standard deviation for image tensors in PyTorch. But it’s important to understand how the transform works and how to reverse it.

## Object Tracking in 75 Lines of Code

2020-08-01

Object tracking is pretty easy conceptually. And if you have a good detector, simple methods can be pretty effective.

## Cross Entropy Loss in PyTorch

Posted 2020-07-24 • Updated 2021-01-28

Cross entropy loss in PyTorch can be a little confusing. Here is a simple explanation of how it works for people who get stuck.
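The usual source of confusion: nn.CrossEntropyLoss applies log-softmax itself, so it expects raw logits and integer class indices, not probabilities or one-hot vectors. A sketch with made-up logits:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# CrossEntropyLoss = log_softmax + negative log likelihood in one step,
# so pass it raw logits and integer class targets.
logits = torch.tensor([[2.0, 0.5, 0.1],
                       [0.2, 3.0, 0.3]])  # shape: (batch, num_classes)
targets = torch.tensor([0, 1])            # correct class per sample

loss = nn.CrossEntropyLoss()(logits, targets)

# Equivalent manual computation
manual = F.nll_loss(F.log_softmax(logits, dim=1), targets)
print(torch.allclose(loss, manual))  # True
```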

## Adding a Dimension to a Tensor in PyTorch

Posted 2017-03-09 • Updated 2020-01-02

Adding a dimension to a tensor can be important when you’re building deep learning models.
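The common case is turning a single sample into a batch of one; a quick sketch:

```python
import torch

t = torch.tensor([1, 2, 3])   # shape: (3,)

# unsqueeze() inserts a size-1 dimension at the given index,
# e.g. turning a single sample into a batch of one.
batched = t.unsqueeze(0)      # shape: (1, 3)

# Indexing with None does the same thing
same = t[None, :]             # shape: (1, 3)
print(batched.shape, same.shape)
```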

## PyTorch Quick Start: Classifying an Image

Posted 2017-02-25 • Updated 2019-02-28

In this post we’ll classify an image with PyTorch. If you prefer to skip the prose, you can check out the Jupyter notebook. Two interesting features of PyTorch are Pythonic tensor manipulation that’s similar to NumPy, and dynamic computational graphs, which handle recurrent neural networks in a more natural way than static computational graphs. A good description …