Quantum eMotion

ECDSA

What is ECDSA?

As a first use case of quantum entropy, we describe a simple public key cryptography method: digital signatures. Digital signatures are widely used on the internet for authenticating users. Their operation mimics real-life document signing, where security means that no one can forge your signature but anyone can verify that you signed the document. In digital signatures, signing and verification are done mathematically, using a private key for signing and a public key for verification. No one can forge your signature because your private key is never shared, and anyone can verify your signature because your public key is available to everyone. Why does a public/private key pair work here? Because the two keys are integers tied together by a mathematical function: a user first generates a private key, a large random integer (256 bits for commonly used curves), then applies a one-way function to it to obtain the public key.

In this example, the mathematical function is based on elliptic curves: the random private key is multiplied by a fixed base point of the curve (scalar multiplication) to obtain the public key, so that verification succeeds exactly for a matching public/private key pair.

How is it Related to Quantum Entropy?

In current systems, private keys are generated by pseudorandom functions that are seeded with values recorded from the parameters of the local device. The private key then undergoes the mathematical transformation specified by the cryptographic scheme to produce the public key. This transformation binds the private key to the public key so that correctness is always guaranteed for a matching public/private key pair.

Security rests on the fact that even with access to the public key, an adversary cannot recover the private key, and so cannot forge signatures or decrypt messages. The randomness of private keys is crucial here: if the key generation procedure is predictable, an adversary can run the same algorithm with the same seed and reproduce the private keys, then act exactly as the legitimate user would.

Quantum entropy ensures that the private key generation process is unpredictable at any given point in time.

ecdsa-img

We describe the ECDSA algorithm, which consists of the following steps:

  1. Retrieve a random number from the QxEaaS API.
  2. Generate an ECDSA key pair.
  3. Sign a user-provided message.
  4. Verify the signature using the public key.
QRNG entropy used: 32 KiB

Check out the full source code on GitHub

View the source code for ECDSA on our GitHub.

Prerequisites

  • Python 3.6 or later
  • requests library
  • base64 library
  • ecdsa library
Deep Dive: ECDSA Setup Guide

Installation

Import necessary libraries

Define API access token & entropy size

Define API request URL

Define and submit the request

Process the response

Generate ECDSA key pair

Sign a message

Verify the signature

RSA

Why RSA?

RSA is another public key cryptography algorithm, used for sending encrypted messages. It is similar to ECDSA in that both use public/private key pairs for their cryptographic operations, but they differ in the underlying mathematical problem that relates the keys. RSA relies on the intractability of the factorization problem: it is easy to compute ( n = p \times q ) for large primes ( p ) and ( q ), but hard to recover ( p ) and ( q ) from ( n ). In RSA, the private key is derived from ( p ) and ( q ) and is used to decrypt messages encrypted with the public key, which contains ( n ).

Role of ( p ) and ( q )

In RSA encryption, prime numbers ( p ) and ( q ) are the foundation of the encryption process. The security of RSA relies on the difficulty of factoring large numbers. If ( p ) and ( q ) are weak (easily guessed numbers), the encryption becomes easy to break. Hence, they need to be truly random and unpredictable.

Quantum Entropy and RSA

In current implementations of RSA, the randomness required to generate ( p ) and ( q ) is taken either from a pseudorandom generator or from bytes extracted from the system's parameters, which leaves it open to side-channel attacks and predictability. An adversary who controls the system can predict the primes ( p ) and ( q ), and with them the private key. QxEaaS's quantum entropy ensures that ( p ) and ( q ) are unpredictable prime numbers, independent of the system parameters.

We describe a scenario of utilizing QxEaaS to generate private keys in RSA in the example below.

Code Description

Here we demonstrate how to generate cryptographically secure private and public keys for an RSA encryption scheme using quantum random numbers from the Quantum entropy API.

RSA encryption relies on having large prime numbers for its private and public keys. To ensure the primes are truly random and unpredictable, we generate random 64-bit integers from the Quantum entropy source and test them for primality. This guarantees the resulting private keys have maximum entropy and cannot be guessed or predetermined by an adversary.

Once we obtain randomly generated prime numbers ( P ) and ( Q ), we calculate the public key components ( N ) and ( E ), as well as the private key ( D ), following the standard RSA process. We then encrypt a sample message using the public key and decrypt it with the private key.

rsa-img

We describe the RSA algorithm, which consists of the following steps:

  1. Retrieve random numbers from the API.
  2. Generate two prime numbers (p and q).
  3. Calculate the RSA variables (n, phi, e, d).
  4. Encrypt a user-provided message.
  5. Decrypt it using the RSA algorithm.
QRNG bytes used : 1 KiB

Check out the full source code on GitHub

View the source code for RSA on our GitHub

Prerequisites:

  • gmpy2
  • requests
  • base64
  • math
  • sympy
Deep Dive: RSA Setup Guide

Installation

Import necessary libraries

Define API access token & entropy size, API request URL, and the request

Get P and Q

Calculate Variables

Encryption

Decryption
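The steps above can be sketched with the Python standard library alone (the repository's code uses gmpy2 and sympy for primality; this stand-in uses a Miller-Rabin test, with `secrets` supplying the test witnesses). In the real flow, the entropy buffers would come from the QxEaaS API rather than the operating system.

```python
import secrets  # witness selection only; the primes themselves come from QRNG bytes

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def prime_from_entropy(entropy: bytes) -> int:
    """Turn QRNG bytes into an odd candidate with the top bit set,
    then search upward for the next probable prime."""
    n = int.from_bytes(entropy, "big") | 1 | (1 << (len(entropy) * 8 - 1))
    while not is_probable_prime(n):
        n += 2
    return n

def rsa_keypair(entropy_p: bytes, entropy_q: bytes, e: int = 65537):
    """Standard RSA key generation from two independent entropy buffers."""
    p, q = prime_from_entropy(entropy_p), prime_from_entropy(entropy_q)
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)  # modular inverse (Python 3.8+)
    return (n, e), d

def rsa_encrypt(m: int, public_key) -> int:
    n, e = public_key
    return pow(m, e, n)

def rsa_decrypt(c: int, d: int, n: int) -> int:
    return pow(c, d, n)
```

Key sizes here are for illustration; production RSA uses primes of 1024 bits or more.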

Kyber

What is Kyber?

In the sections above, we introduced two public key cryptography schemes, ECDSA and RSA. In this section we discuss a similar cryptography scheme, Kyber, which adds quantum-secure properties. The emergence of quantum computers poses a major threat to cryptography, because the hard problems underlying current schemes, such as the factorization problem and the discrete logarithm problem, can be solved by Shor's algorithm on a sufficiently large quantum computer. To counter this threat, researchers proposed building new cryptography schemes on even harder mathematical problems. One such family of problems arises from lattice theory.

Kyber is a public key encryption scheme based on lattice primitives, where the hard problem is to find a short vector in a dense, uniformly spaced collection of lattice vectors: the Shortest Vector Problem (SVP). This problem is conjectured to be quantum resistant, meaning there are currently no quantum algorithms that can solve it efficiently.

Quantum Entropy and Kyber

As with classical cryptography schemes like RSA and ECDSA, Kyber uses a public key for encryption and a private key for decryption. Although there have been no attacks on Kyber's underlying hard problem, there are side-channel attacks on the algorithm that aim to recover the private key. These attacks target the procedure that generates the seeds fed into private key generation: the system configuration is recorded, and the extracted bytes are passed through a permutation algorithm such as SHAKE-256 to derive the private key.

An adversary who controls the device can predict the SHAKE-256 output and thereby obtain the private key. With a quantum random number generator, the dependency on system parameters is removed: the algorithm can call the QxEaaS API directly to fetch random bytes for SHAKE-256. Even an adversary who controls the system parameters cannot predict or hijack the bytes from the quantum API.
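The seeding step just described can be sketched in a few lines: instead of hashing recorded system parameters, the bytes fetched from the QxEaaS API are expanded with SHAKE-256 into the seed handed to key generation. The output length is an assumption for illustration.

```python
import hashlib

def derive_keygen_seed(quantum_entropy: bytes, out_len: int = 48) -> bytes:
    """Expand quantum entropy with SHAKE-256 into the seed fed to key
    generation, replacing seeds derived from predictable system state."""
    return hashlib.shake_256(quantum_entropy).digest(out_len)
```

Because the input bytes come from the QRNG rather than the device, an adversary who observes the system configuration learns nothing about the seed.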

Code Description

Here we demonstrate using quantum randomness from the QRNG API for key generation in post-quantum cryptography.

We initialize a cryptographically secure pseudo-random number generator (AES256-CTR DRBG) using the entropy. This entropy seed ensures the PRNG output is uniformly random and unpredictable.

We then perform a Kyber512 lattice-based key exchange, setting the Kyber PRNG seed to output from our DRBG. This guarantees the private/public key pairs generated have maximum entropy.

After encrypting a message with the recipient's public key, we decrypt it back with the private key. The shared secrets match, showing correct encryption and decryption.

By front-loading the key generation process with quantum randomness, we protect the security of the lattice-based cryptosystem even against quantum computers that could potentially break traditional public key schemes. High-quality randomness is a core requirement for post-quantum security, since a compromised classical randomness source undermines even quantum-resistant algorithms.

This demonstration therefore shows how quantum entropy from QRNG can enhance the assurance of newer post-quantum cryptographic protocols that will become increasingly important for long-term security in the quantum era.

kyber-img

We describe the Kyber algorithm, which consists of the following steps:

  1. Fetch random entropy from an API endpoint.
  2. Initialize a cryptographic random number generator.
  3. Perform a key exchange using Kyber512.
  4. Encrypt and decrypt a user-provided message.
  5. Verify the integrity of the shared secret.
QRNG bytes used : 1 KiB

Check out the full source code on GitHub

View the source code for Kyber on our GitHub

Prerequisites:

  • os
  • requests
  • dotenv
Deep Dive: Kyber Setup Guide

Setup

Installation

Import Libraries

Retrieve Access Token and Entropy Size

Initialize DRBG

Perform key exchange

Verify decryption worked

Dilithium

What is Dilithium?

Crystals-Dilithium is analogous to ECDSA except that it is a quantum-resistant digital signature algorithm based on lattice-based cryptography primitives similar to Kyber described above. Crystals-Dilithium uses a private key to sign digital documents, and any verifier can use the public key to check the signature.

Quantum Entropy and Dilithium

The private keys in Crystals-Dilithium are derived from seeds extracted from system parameters, which makes them prone to side-channel attacks and key prediction. Quantum entropy can supply truly random bytes to the key generation process, making it resistant to many side-channel and key prediction attacks.

Code Description

This code implements digital signatures using the Dilithium post-quantum signature scheme from the dilithium-py Python module. Like in traditional cryptography, Dilithium uses public-private key pairs for signing and verification.

The sign_and_verify function first retrieves quantum entropy from QRNG to inject randomness into the signing process. It appends 16 bytes of entropy to the message before signing.

This ensures the resulting signature is not predictable or linkable based solely on the message contents. The quantum entropy strengthens the security of the private key generation process, making it impossible to predetermine the secret key.
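The entropy-append step described above can be sketched as follows; the actual key generation, signing, and verification are performed by the dilithium-py module, and this helper only shows how the 16 bytes of QRNG output are concatenated to the message before it is handed to the signer.

```python
def message_with_entropy(message: bytes, entropy: bytes) -> bytes:
    """Append 16 bytes of quantum entropy to the message before signing,
    so two signatures over identical messages are not linkable."""
    if len(entropy) < 16:
        raise ValueError("need at least 16 bytes of entropy")
    return message + entropy[:16]
```

The verifier must of course receive the same entropy-extended message to check the signature against.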

Without a cryptographically strong entropy source, there is a possibility an attacker could determine the private key through analysis. However, by using quantum entropy from Quantum, the keys are completely random and unpredictable.

The function then generates a Dilithium key pair, signs the message with entropy, and verifies the signature. This demonstrates how Dilithium signatures can benefit from integration with a quantum entropy source.

The main function allows choosing between Dilithium2, Dilithium3, or Dilithium5 security levels and iterating the sign/verify process. Overall, this code provides a working example of post-quantum digital signatures secured by quantum randomness from Quantum. The modularity also makes it easy to experiment with different Dilithium parameters.

dilithiumimg

We describe the Dilithium algorithm, which consists of the following steps:

  1. Retrieve an access token and entropy size from environment variables.
  2. Fetch random entropy from an API endpoint using the access token and entropy size.
  3. Generate a Dilithium key pair (public and private keys).
  4. Sign the user-provided message (concatenated with the 16-byte entropy) using the private key.
  5. Verify the signature using the public key.
  6. Print the public key, signature, and the verification result.
QRNG bytes used : 1 KiB

Check out the full source code on GitHub

View the source code for Dilithium on our GitHub

Prerequisites:

  • os
  • requests
  • dotenv
Deep Dive: Dilithium Setup Guide

Setup

Installation

Import Libraries

Define Function to sign and verify

Prompt the user for a message

Data Sampling

What is Data Sampling?

Let's assume you want to find out what music people like in your city. Interviewing everyone would be time-consuming and tedious. A random data sample generator helps here by selecting a small group of people from across the city, ensuring everyone has a fair chance of being chosen. By analyzing the musical tastes of this random sample, you get a good idea of what the whole city enjoys without talking to everyone. This is how many fields use random samples: to test software, understand customer preferences, or analyze trends without processing all the data.

Similarly, when we need data samples for testing or training, it is very important for us to take random data samples for use. This ensures that the model is trained and tested on multiple data samples, making it more accurate in the application.

Quantum Entropy and Data Sampling

Pseudorandom number generators can produce random-looking numbers, but picking truly fair data samples requires genuine unpredictability. With QxEaaS and quantum randomness, every data sample has a truly equal chance of being picked, even if random numbers are requested a million times. This randomness guarantees the samples are unbiased, making the analysis more reliable and accurate.

Code Description

This code implements taking random samples from a CSV file using quantum entropy for randomness.

It retrieves random numbers from the Quantum entropy API to generate random indices for sampling rows from the CSV. These random indices ensure the sampled rows are unpredictable and not linkable to the data contents.

Without a cryptographically strong entropy source, it may be possible for an attacker to determine the patterns in how rows are selected for sampling. But by using quantum entropy from Quantum, the indices are completely random, and each sample provides an independent and unbiased view of the data.

The get_random_int() function processes the quantum random number to extract a random integer, modulo the total number of rows. This integer is then used as an index to select a random row from the CSV using Pandas.

By injecting quantum randomness at this early stage of sampling, the entire downstream analysis relying on the samples gains an increased level of security. No external party can potentially influence or bias the sampling process.

The modularity of sourcing entropy separately also allows integrating with other QRNG services if needed. Overall, this code presents a simple yet secure way to randomly sample CSV data, leveraging quantum entropy for unpredictability.
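The index-extraction step can be sketched as below; the base64 field shape is an assumption about the API response, and note that a plain modulo carries a small bias unless the decoded integer's range is much larger than the row count.

```python
import base64

def get_random_int(entropy_b64: str, n_rows: int) -> int:
    """Decode base64 QRNG bytes and reduce modulo the row count to get a
    row index in [0, n_rows)."""
    raw = base64.b64decode(entropy_b64)
    return int.from_bytes(raw, "big") % n_rows
```

The resulting index is then passed to Pandas (e.g. `df.iloc[index]`) to select the sampled row.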

QRNG bytes used : 1 KiB

Check out the full source code on GitHub

View the source code for Sampling on our GitHub

Prerequisites:

  • os
  • requests
  • dotenv
Deep Dive: Data Sampling Setup Guide

Import Libraries

Request random number

Main function

Randomized Simulation

What is Randomized Simulation?

The roots of randomized simulation trace back to the invention of the Monte Carlo method (Metropolis and Ulam, 1949), which computes results through repeated random sampling and subsequent statistical analysis. Among the most recent applications of randomized simulation is deep learning. The data on which these models are trained is often insufficient to emulate real-world scenarios, because learning to act in the physical world is orders of magnitude more difficult. Simulation can generate the real-life parameters these models learn from in a faster, cheaper, and safer way, with unmatched diversity. Again, generating these random numbers depends on algorithmic processes that define the parameters of higher and lower importance in the learning data, and the deterministic nature of pseudorandom number generators leaves a large gap in approximating real events. High-entropy randomness can be used to set the various parameters in the simulators, producing learning data that approximates the real world far more closely.

Quantum Entropy and Randomized Simulation

Random numbers let a simulation include the variability that occurs in real life. Each place where randomness is required within a simulation uses a separate random stream; this allows one aspect of the simulation to be changed without affecting the random occurrences elsewhere. QxEaaS can be used to instantiate each random stream, yielding a more realistic parameter set.

Code Description

This code implements a Monte Carlo sales simulation that utilizes quantum entropy for random number generation.

It retrieves random numbers from the quantum entropy API to introduce randomness in important inputs like sales targets, percentage to target, and distribution sampling.

By leveraging quantum randomness, these inputs are made cryptographically random and unpredictable. Without a strong entropy source, there exists a possibility that an external party could influence or bias the simulation results.

The qrng_normal() and qrng_choice() functions generate random numbers following normal distributions and random selections from lists, using the quantum entropy. This quantum randomness is injected at the core of the simulation process right from generating inputs.

Multiple simulations are run in a loop, each time relying on fresh quantum entropy. Key statistics like total sales, commissions and targets are tracked over many runs.

The results provide an unbiased, independent view of the sales process. By basing the critical randomizations on quantum entropy, the overall analysis gains confidence that no outside force could potentially skew or influence the outcomes.

In summary, this code presents a Monte Carlo simulation model that is made robust and secure against prediction or manipulation by integrating quantum randomness from the onset, enhancing reliability of the inferred insights.
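The qrng_normal() and qrng_choice() helpers can be sketched with the standard library; the exact byte-to-number mapping below is an illustrative assumption, not the repository's code. A normal draw is built from two uniform draws via the Box-Muller transform.

```python
import math

def qrng_uniform(entropy: bytes) -> float:
    """Map 8 QRNG bytes to a float in [0, 1)."""
    return int.from_bytes(entropy[:8], "big") / 2**64

def qrng_normal(mean: float, std: float, e1: bytes, e2: bytes) -> float:
    """Box-Muller transform driven by two uniform draws from QRNG bytes."""
    u1 = max(qrng_uniform(e1), 1e-12)  # avoid log(0)
    u2 = qrng_uniform(e2)
    z = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    return mean + std * z

def qrng_choice(options, entropy: bytes):
    """Pick one element of `options` using QRNG bytes instead of random.choice."""
    return options[int.from_bytes(entropy[:8], "big") % len(options)]
```

Each simulation run consumes fresh entropy buffers, so no two runs share a predictable random stream.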

simulation-img
QRNG bytes used : 1 KiB

Check out the full source code on GitHub

View the source code for Simulation on our GitHub
Deep Dive: Randomized Simulation Setup Guide

Import Libraries

Define Variables

Define Function to generate random numbers

Define Function to generate random numbers from given list

Generate Data frame

Define Function to return commission rate

Loop through many simulations

Display Result

QRNG in Blockchain


The core of the encryption utilized in blockchains relies on secure randomness. Cryptographic hash functions are crucial for creating a private key for a cryptocurrency wallet, as they make it extremely difficult to guess the private key of a given wallet.

Decentralized consensus is limited by the number of messages that can be sent in a given time (throughput) and the time it takes to send a message across the network (latency). In a public blockchain where thousands of decentralized participants must reach agreement, it would not be practical for every node to send messages to every other node. To limit the number of messages sent to reach consensus, Bitcoin uses Proof of Work (PoW) as a source of randomness that determines which block is added to the blockchain.

Randomness is also often used in proof-of-stake (PoS) systems to enforce a fair and unpredictable distribution of validator responsibilities. If a malicious actor can influence the source of randomness used in the selection process, they can increase their chances of being selected and compromise the security of the network. Quantum randomness becomes relevant in these scenarios. If a QRNG can select validators with its true entropy, it makes it impossible for any adversary to predict the next set of validator nodes.

In this section, we provide an example of how blockchains can utilize randomness from a quantum source and make themselves resistant to an adversary that aims to hijack the network by predicting the validators. To achieve this, the blockchain needs to fetch quantum entropy from an external source. Since blockchains are closed systems, they cannot call the API directly; instead, a relayer, also called an oracle, fetches the randomness from the API and submits it to the blockchain through a smart contract. Upon receiving the true randomness, the smart contract can use it to make decisions on the network.

Solidity Smart Contract

A Solidity Smart Contract is a program written on the Ethereum blockchain that automatically executes agreements when certain conditions are met. Instead of needing a person to restock or manage the money, the smart contract can handle everything. This is useful for creating secure and transparent systems for things like online marketplaces, voting, or even managing loans, all without relying on a middleman.

Remix Ethereum IDE

Remix Ethereum IDE is like a building kit for smart contracts. It's a user-friendly platform that combines several tools in one place. You can write the smart contract code in a clear editor, compile it to check for errors, and even deploy it to the blockchain. Remix is great for both beginners and experts, offering a visual debugger to understand how the smart contract works and fix any issues. It makes developing secure and functional smart contracts on the Ethereum blockchain much easier.
Code Description

This code demonstrates using quantum randomness for securing smart contracts by initializing on-chain random numbers with entropy from a QRNG.

It retrieves 64 bits of cryptographically random data from the QRNG API. These random bits are then sent and stored on the Ethereum blockchain using a smart contract.

By leveraging a QRNG, the random number injected into the smart contract is guaranteed to be unpredictable and independent of any external influences. Without a strong entropy source, there is a possibility an attacker could bias the random number initialization.

The get_random_bits_from_qrng_api() function processes the raw quantum randomness into a 64-bit integer. This integer is then passed to a smart contract using the Web3 library transaction API.

Once set on-chain, the random number is publicly verifiable but no party can determine its value beforehand due to the unpredictability of true quantum randomness.
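The conversion step can be sketched as follows; the base64 response shape is an assumption, and the commented web3 call uses illustrative names rather than the repository's actual contract interface.

```python
import base64

def get_random_bits_from_qrng_api(entropy_b64: str) -> int:
    """Decode base64 QRNG output and take the first 8 bytes as an unsigned
    64-bit integer, ready to submit on-chain via a web3 transaction."""
    raw = base64.b64decode(entropy_b64)
    if len(raw) < 8:
        raise ValueError("need at least 8 bytes of entropy")
    return int.from_bytes(raw[:8], "big")

# Submitting with web3.py would then look roughly like (names illustrative):
# tx = contract.functions.setRandomNumber(value).transact({"from": account})
```

Once the transaction is mined, the value is publicly readable but was unpredictable before submission.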

Overall, this demonstration provides a blueprint for infusing quantum entropy securely into decentralized applications to strengthen on-chain unpredictability against manipulation or prediction attacks on smart contracts. The modularity also allows integrating different QRNG services.

By combining the strengths of blockchain transparency with quantum randomness, developers can build more robust applications resistant to tampering with initial conditions.

QRNG bytes used : 1 KiB

Check out the full source code on GitHub

View the source code for Blockchain on our GitHub
Deep Dive: QRNG Blockchain Setup Guide

Getting Random Bits from QRNG API

Create a Solidity Smart Contract

In the Remix Ethereum IDE, create a new Solidity file and define a smart contract that can receive the random bits from the Python script.

Send the Random Bits to the Smart Contract

Quantum Randomness in AI: A Comprehensive Guide to QRNG API Integration

Introduction

In the rapidly evolving landscape of artificial intelligence, the quest for enhancing model performance and security is unending. One frontier that holds immense potential is the incorporation of true randomness through Quantum Random Number Generators (QRNG). Unlike classical pseudo-random generators, QRNGs harness the inherent unpredictability of quantum mechanics, providing truly random numbers that can elevate AI models' robustness and unpredictability.

In this comprehensive guide, we'll explore the need for quantum randomness in AI, dive into the QRNG API, and provide a step-by-step tutorial on integrating quantum randomness into neural networks through quantum random initialization and quantum dropout. By the end of this article, you'll understand how to leverage QRNG to potentially improve your AI models and stay ahead in the generative AI landscape.

The Need for Quantum Randomness in AI

Randomness plays a crucial role in machine learning algorithms, particularly in neural network initialization and regularization techniques like dropout. Traditional methods rely on pseudo-random number generators, which, despite being deterministic, simulate randomness adequately for many applications. However, as models become more complex and security concerns rise, the limitations of pseudo-randomness become apparent.
Quantum Random Number Generators (QRNGs) offer a solution by providing truly random numbers derived from quantum phenomena. This level of randomness can:
  • Enhance security: true randomness is unpredictable, making models less susceptible to attacks that exploit deterministic patterns.
  • Improve performance: some studies suggest that true randomness can lead to better generalization in neural networks.
  • Innovate AI development: incorporating quantum elements opens new avenues for research and development in AI.
By integrating QRNGs into AI models, developers can explore these benefits and contribute to cutting-edge advancements in the field.

Overview

Below is the flowchart representing the approach:

simulation-img

Understanding the QRNG API

To harness quantum randomness, we'll use a QRNG API that provides base64-encoded quantum random numbers. Here's a breakdown of the `QuantumRandomGenerator` class, which interfaces with the API:

Key Features:

  • API Token Authentication : Securely access the API using your unique token.
  • Size Limitation Handling : The API supports sizes up to 512 bytes to ensure efficient entropy usage.
  • Data Conversion : Retrieves random numbers in bytes, decodes from base64, and converts to a NumPy array.
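A minimal sketch of such a client is below; the endpoint URL and the response field (`entropy`) are placeholders for the real QxEaaS values.

```python
import base64

import numpy as np
import requests

class QuantumRandomGenerator:
    """Sketch of the QRNG API client; URL and response fields are assumptions."""
    MAX_BYTES = 512  # API limit per request

    def __init__(self, token: str, url: str = "https://api.example.com/qrng"):
        self.token = token
        self.url = url

    def get_random(self, n_bytes: int):
        n_bytes = min(n_bytes, self.MAX_BYTES)  # respect the size limitation
        resp = requests.post(self.url, json={"size": n_bytes},
                             headers={"Authorization": f"Bearer {self.token}"},
                             timeout=10)
        resp.raise_for_status()
        raw = base64.b64decode(resp.json()["entropy"])  # base64 -> bytes
        return np.frombuffer(raw, dtype=np.uint8)       # bytes -> NumPy array
```

Requests larger than 512 bytes are clamped to the API's per-call limit; the buffering layer later in this guide handles larger demands.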

Integrating QRNG into Neural Networks

Let's dive into how quantum randomness can be integrated into neural networks, specifically through quantum random initialization and quantum dropout.

Quantum Random Initialization

Weight initialization is critical in neural networks, influencing the convergence and performance of models. By initializing weights with quantum random numbers, we introduce true randomness into the network.

Implementation

Explanation:

  • QuantumRandomUniform Function : Generates a tensor with a uniform distribution of quantum random numbers within a specified range `[a, b)` .
  • QuantumInitializedLinear Class : Subclasses `nn.Linear` to override the `reset_parameters` method, initializing weights and biases with quantum random numbers.
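A framework-agnostic sketch of the initializer's core is below. The tutorial's `QuantumRandomUniform` produces PyTorch tensors; this NumPy version shows the same byte-to-uniform mapping, where the two-bytes-per-value encoding is an assumed choice for illustration.

```python
import numpy as np

def quantum_random_uniform(qrng_bytes: bytes, shape, a: float = -0.1, b: float = 0.1):
    """Map QRNG bytes (2 per value) to uniform samples in [a, b)."""
    n = int(np.prod(shape))
    if len(qrng_bytes) < 2 * n:
        raise ValueError("not enough entropy for the requested shape")
    raw = np.frombuffer(qrng_bytes[:2 * n], dtype=np.uint16)
    u = raw.astype(np.float64) / 65536.0  # normalize to [0, 1)
    return (a + (b - a) * u).reshape(shape)
```

A `QuantumInitializedLinear`-style layer would call this inside `reset_parameters` to fill its weight and bias arrays.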

Quantum Dropout

Dropout is a regularization technique that randomly zeros out some neurons during training to prevent overfitting. Using quantum randomness for dropout introduces true unpredictability in neuron selection.

Implementation :

Explanation:

  • QuantumDropout Function : Applies dropout using quantum random numbers to decide which neurons to drop.
  • Normalization : Random numbers are normalized to `[0, 1)` to create a dropout mask.

Benchmarking Quantum vs. Classical Models

To evaluate the impact of quantum randomness, we'll compare a quantum-initialized model with a classical one on the MNIST dataset.

Quantum Model Implementation

Training Function

Classical Model Implementation

For comparison, we'll train a classical model with standard initialization and dropout.

Evaluation

After training both models, evaluate their performance on a validation set to compare accuracy, loss, and generalization capabilities. Monitor metrics like:
  • Training and Validation Accuracy
  • Loss Curves
  • Overfitting Signs

Conclusion

Integrating quantum randomness into AI models is an exciting frontier that combines quantum physics with machine learning. By using QRNG for weight initialization and dropout, we introduce true randomness that could enhance model performance and security. While the benefits may vary depending on the application, this approach opens new avenues for innovation in AI development.

Key Takeaways:

  • Quantum randomness provides unpredictability that cannot be replicated by classical pseudo-random generators.
  • Incorporating QRNG into neural networks is feasible and can be implemented with manageable modifications to standard practices.
  • Benchmarking against classical models is essential to assess the practical benefits.

Access the Complete Tutorial


Check out the full source code on GitHub

To explore the full implementation details and run the code yourself, access the complete tutorial in our Jupyter Notebook.

This resource includes:

  • 1Complete source code with explanations.
  • 2Step-by-step instructions for setting up the environment.
  • 3Extended evaluation and visualization of results.

Stay Ahead in AI Innovation

By embracing quantum technologies today, you position yourself at the forefront of AI innovation. Whether you're a researcher, developer, or enthusiast, experimenting with QRNG can provide valuable insights and potentially unlock new capabilities in your AI projects.
Get Started Now : Integrate quantum randomness into your models and share your findings with the community!
For further questions or support, feel free to reach out in the comments or open an issue on the GitHub repository.

Utilizing Quantum Randomness as Noise in Neural Networks: A Practical Guide

Unlock the power of Quantum Random Number Generators (QRNG) to enhance your AI models.
Introduction

As artificial intelligence (AI) continues to evolve, the quest for integrating cutting-edge technologies into AI models intensifies. One such frontier is the incorporation of Quantum Random Number Generators (QRNGs) into neural networks. Unlike classical pseudo-random number generators (PRNGs), QRNGs harness the inherent unpredictability of quantum mechanics to produce true randomness.

In this tutorial, we'll explore how to integrate QRNGs into a neural network designed to recognize handwritten digits using the MNIST dataset. We'll walk through the essential components of the codebase, explain how quantum randomness is infused into the model, and demonstrate the potential benefits of this approach.

Why Quantum Randomness in AI?

Traditional PRNGs are algorithmically determined and can, in theory, be predicted if the initial seed is known. QRNGs, on the other hand, rely on quantum phenomena, making their outputs fundamentally unpredictable. Integrating QRNGs into AI models can:
  • Enhance Security: Improve resistance against attacks that exploit predictable randomness.
  • Increase Robustness: Introduce true randomness to prevent overfitting and improve generalization.
  • Foster Innovation: Open new avenues for research at the intersection of quantum computing and AI.

Overview of the Implementation

We'll build a simple neural network with quantum randomness integrated through four components:
  • QuantumRandomGenerator: Fetches quantum random numbers from a QRNG API.
  • QuantumRandomBuffer: Manages quantum random numbers efficiently.
  • QRNGLayer: A neural network layer that injects quantum noise into its computations.
  • QuantumNeuralNetwork: The full model utilizing QRNG layers.

Prerequisites

  • Python 3.6 or higher
  • PyTorch
  • NumPy
  • Requests library
  • Scikit-learn
  • Seaborn
  • Matplotlib
  • An API token for Quantum eMotion's Entropy-as-a-Service (EaaS) API
DEEP DIVE
1. Setting Up the Quantum Random Generator
First, we create a class to interact with the QRNG API and fetch random numbers.


Key Points:
  • API Interaction: The class communicates with the QRNG service using HTTP requests.
  • Data Handling: Fetches random bytes and converts them into a NumPy array.
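The class described above can be sketched as follows. The endpoint URL, request parameter names, and header format here are placeholders rather than the real EaaS API, and a local `os.urandom` fallback keeps the sketch runnable without a token:

```python
import os
import numpy as np

class QuantumRandomGenerator:
    """Fetches random bytes from a QRNG HTTP endpoint and returns them
    as a NumPy array. URL and parameters are illustrative placeholders;
    without an API token the class falls back to os.urandom."""

    def __init__(self, api_url="https://qrng.example.com/random", api_token=None):
        self.api_url = api_url      # hypothetical endpoint
        self.api_token = api_token  # e.g. os.environ.get("API_TOKEN")

    def get_random_bytes(self, n):
        if self.api_token:
            import requests  # imported lazily; only needed for live calls
            resp = requests.get(
                self.api_url,
                params={"size": n},  # assumed parameter name
                headers={"Authorization": f"Bearer {self.api_token}"},
                timeout=10,
            )
            resp.raise_for_status()
            raw = resp.content[:n]
        else:
            raw = os.urandom(n)  # classical fallback for offline testing
        return np.frombuffer(raw, dtype=np.uint8)

    def get_uniform(self, n):
        # Scale raw bytes to floats in [0, 1)
        return self.get_random_bytes(n).astype(np.float64) / 256.0
```

With a valid token in the environment, the same interface would transparently serve quantum entropy instead of the fallback.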

2. Efficient Quantum Random Number Management

To handle large requests efficiently, we implement a buffer system.
Key Points:
  • Buffering: Reduces the number of API calls by fetching larger chunks of random numbers.
  • Optimal Chunk Size: Adjusts the request size based on API limitations.
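A minimal sketch of such a buffer, assuming the generator exposes a `get_uniform(n)` method returning floats in [0, 1) (the class and method names are illustrative):

```python
import numpy as np

class QuantumRandomBuffer:
    """Buffers quantum random numbers so many small consumers share one
    large fetch instead of triggering an API call each time."""

    def __init__(self, generator, chunk_size=4096):
        self.generator = generator
        self.chunk_size = chunk_size  # tune to the API's maximum request size
        self._buffer = np.empty(0, dtype=np.float64)

    def get(self, n):
        # Refill one chunk at a time until the request can be served
        while self._buffer.size < n:
            fresh = self.generator.get_uniform(self.chunk_size)
            self._buffer = np.concatenate([self._buffer, fresh])
        out, self._buffer = self._buffer[:n], self._buffer[n:]
        return out
```

Because each refill fetches `chunk_size` numbers at once, several small `get()` calls typically cost only a single upstream request.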

3. Building Quantum-Enhanced Neural Network Layers

We create a custom layer that injects quantum randomness into the inputs.
Key Points:
  • Noise Injection: Adds quantum noise to the input features.
  • Scalability: Supports GPU acceleration by moving tensors to CUDA if available.

4. Constructing the Quantum Neural Network

We assemble the network using our custom QRNG layers.
Key Points:
  • Architecture: Designed for the MNIST dataset with 28x28-pixel inputs.
  • Activation and Regularization: Uses ReLU activations and dropout for better generalization.
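Putting the two pieces together, here is one way the noise-injecting layer and the full model could look. The additive-noise scheme, layer widths, and `noise_scale` default are illustrative assumptions, and `noise_source(n)` stands in for a quantum buffer returning uniforms in [0, 1):

```python
import torch
import torch.nn as nn

class QRNGLayer(nn.Module):
    """Linear layer that adds zero-centred quantum noise to its input
    during training; no noise is added at inference time."""

    def __init__(self, in_features, out_features, noise_source, noise_scale=0.05):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.noise_source = noise_source
        self.noise_scale = noise_scale

    def forward(self, x):
        if self.training:
            flat = self.noise_source(x.numel())
            noise = torch.as_tensor(flat, dtype=x.dtype, device=x.device)
            x = x + self.noise_scale * (noise.reshape(x.shape) - 0.5)
        return self.linear(x)

class QuantumNeuralNetwork(nn.Module):
    """MNIST classifier assembled from QRNG layers; depth and widths
    are a sketch, not necessarily the notebook's exact architecture."""

    def __init__(self, noise_source):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            QRNGLayer(28 * 28, 256, noise_source),
            nn.ReLU(),
            nn.Dropout(0.2),
            QRNGLayer(256, 128, noise_source),
            nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(128, 10),  # ten digit classes
        )

    def forward(self, x):
        return self.net(x)
```

Moving the model to CUDA with `model.to("cuda")` works as usual; the noise tensor is created on the input's device.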

5. Training the Model

Preparing the Dataset

Setting Up Training Components

6. Evaluating the Model

Key Points:
  • Inference Mode: Disables gradient computation for efficiency.
  • Accuracy Measurement: Uses scikit-learn to calculate accuracy.
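An evaluation loop matching these key points might look like the following sketch (the function name and signature are illustrative, not the notebook's exact API):

```python
import torch
from sklearn.metrics import accuracy_score

def evaluate(model, data_loader, device="cpu"):
    """Run the model over a loader without gradients and return accuracy."""
    model.eval()
    all_preds, all_labels = [], []
    with torch.no_grad():  # inference mode: no gradient bookkeeping
        for images, labels in data_loader:
            logits = model(images.to(device))
            all_preds.extend(logits.argmax(dim=1).cpu().tolist())
            all_labels.extend(labels.tolist())
    return accuracy_score(all_labels, all_preds)
```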

7. Visualizing the Results

You can plot a confusion matrix to visualize the model's performance across different classes.

Training and Testing Loop

Key Points:
  • Device Configuration: Utilizes GPU if available for faster training.
  • Loss Calculation: Uses cross-entropy loss, suitable for multi-class classification.
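The core of the training loop, as a hedged sketch (function name and return value are illustrative; the notebook's loop may differ in logging and checkpointing):

```python
import torch
import torch.nn as nn

def train_one_epoch(model, loader, optimizer, device=None):
    """One pass over the data with cross-entropy loss; the device is
    chosen automatically unless given, mirroring the key points above."""
    device = device or ("cuda" if torch.cuda.is_available() else "cpu")
    model.to(device).train()
    criterion = nn.CrossEntropyLoss()  # multi-class classification
    total = 0.0
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        total += loss.item()
    return total / len(loader)
```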

Access the Complete Tutorial


Check out the full source code on GitHub

To explore the full implementation details and run the code yourself, access the complete tutorial in our Jupyter Notebook.

Conclusion

In this tutorial, we've demonstrated how to integrate quantum randomness into a neural network. By injecting quantum noise into the inputs of our custom layers, we aim to enhance the model's robustness and generalization capabilities.

Potential Benefits:

  • Improved Generalization: The stochastic nature may help prevent overfitting.
  • Enhanced Security: True randomness can make models more resistant to certain types of attacks.

Next Steps:

  • Experiment with Different Architectures: Try deeper networks or convolutional layers.
  • Adjust Noise Scaling: Fine-tune the `noise_scale` parameter to observe its effect.
  • Apply to Other Datasets: Test the approach on more complex datasets like CIFAR-10 or ImageNet.

Embrace the future by integrating quantum technologies into your AI models today! If you found this tutorial helpful, don't forget to clap and share it with your fellow developers interested in quantum computing and AI.

Generating Synthetic Data with Quantum Randomness: A Comprehensive Guide

Unlock the power of Quantum Random Number Generators (QRNGs) to enhance your AI models and generate high-quality synthetic data.
Introduction

As artificial intelligence (AI) continues to evolve, the demand for high-quality data grows exponentially. Synthetic data generation has emerged as a viable solution to augment datasets, especially when real data is scarce or sensitive. Traditional methods rely on pseudo-random number generators (PRNGs), which, while efficient, are deterministic and potentially predictable.

Quantum Random Number Generators (QRNGs) leverage the inherent unpredictability of quantum mechanics to provide true randomness, offering potential benefits for model robustness and security. In this tutorial, we'll explore how to integrate QRNGs into a Variational Autoencoder (VAE) for synthetic data generation. We'll cover:
  • Setting up a QRNG with a fallback mechanism.
  • Building a QRNG-enhanced VAE.
  • Training the model on real data.
  • Generating and evaluating synthetic data.
Why Quantum Randomness in AI?

True Randomness vs. Pseudo-Randomness

  • Pseudo-Random Number Generators (PRNGs): Generate sequences that appear random but are deterministic if the initial seed is known.
  • Quantum Random Number Generators (QRNGs): Utilize quantum phenomena to produce numbers that are fundamentally unpredictable.

Benefits of Integrating QRNGs

  • Enhanced Security: True randomness reduces vulnerability to attacks exploiting predictable patterns.
  • Improved Robustness: Introducing genuine randomness can prevent overfitting and improve generalization.
  • Innovation: Opens new research avenues at the intersection of quantum computing and AI.

Overview

Below is the flowchart representing the data generation process:
[Flowchart: QRNG-enhanced synthetic data generation]

Prerequisites

  • Python 3.6+
  • PyTorch
  • NumPy
  • Pandas
  • Scikit-learn
  • Requests library
  • An API token for Quantum eMotion's Entropy-as-a-Service (EaaS) API
DEEP DIVE
1. Setting Up the Quantum Random Number Generator
First, we'll create a `QuantumRandomGenerator` class to interact with the QRNG API. This class includes a caching mechanism to minimize API calls.
Important Notes:
  • API Endpoint:
  • API Token: Ensure you have a valid API token and set it appropriately.
  • Caching: Uses an LRU cache to minimize API calls and handle rate limits.

2. Managing Quantum Random Numbers Efficiently

To handle large requests and minimize API calls, we'll implement a buffer system with the `QuantumRandomBuffer` class.
Key Features:
  • Buffering: Stores quantum random numbers to reduce API requests.
  • Optimal Chunk Size: Ensures requests are made in sizes supported by the API.

3. Building the QRNG-Enhanced Variational Autoencoder

3.1 Preparing the Dataset

We'll create a custom `TabularDataset` class to handle our tabular data.
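A minimal sketch of such a dataset wrapper (the constructor and internals are assumptions; the notebook's version may also carry labels or scaling logic):

```python
import numpy as np
import torch
from torch.utils.data import Dataset

class TabularDataset(Dataset):
    """Wraps a 2-D NumPy array of features as float32 tensors so a
    standard DataLoader can batch over the rows."""

    def __init__(self, data):
        self.data = torch.as_tensor(np.asarray(data), dtype=torch.float32)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return self.data[idx]
```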

3.2 Implementing the QRNG Variational Autoencoder

Key Components:
  • Quantum Noise Injection: Uses quantum random numbers in the reparameterization trick.
  • Encoder and Decoder: Standard VAE architecture with fully connected layers.
  • Reparameterization Trick: Allows backpropagation through stochastic variables.
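The components above can be sketched as a compact VAE whose reparameterization noise comes from a quantum source instead of `torch.randn`. The layer sizes are illustrative, and `noise_source(n)` is an assumed interface returning n uniform floats in [0, 1), here converted to normals via the Box-Muller transform:

```python
import numpy as np
import torch
import torch.nn as nn

class QRNGVAE(nn.Module):
    """Minimal VAE with quantum noise in the reparameterization trick."""

    def __init__(self, input_dim, latent_dim, noise_source, hidden=64):
        super().__init__()
        self.noise_source = noise_source
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden), nn.ReLU())
        self.fc_mu = nn.Linear(hidden, latent_dim)
        self.fc_logvar = nn.Linear(hidden, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(), nn.Linear(hidden, input_dim)
        )

    def quantum_normal(self, shape):
        # Box-Muller: two uniform draws -> one standard-normal draw
        n = int(np.prod(shape))
        u1 = np.clip(self.noise_source(n), 1e-10, 1.0)  # avoid log(0)
        u2 = self.noise_source(n)
        z = np.sqrt(-2.0 * np.log(u1)) * np.cos(2.0 * np.pi * u2)
        return torch.as_tensor(z, dtype=torch.float32).reshape(shape)

    def reparameterize(self, mu, logvar):
        eps = self.quantum_normal(mu.shape).to(mu.device)
        return mu + eps * torch.exp(0.5 * logvar)  # differentiable in mu, logvar

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar
```

Because the noise enters only as a multiplicative constant on `exp(0.5 * logvar)`, gradients still flow through `mu` and `logvar` exactly as in a standard VAE.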

4. Training the Model on Real Data

4.1 Preparing the Real Data

We'll use the Breast Cancer Wisconsin dataset from scikit-learn as our real dataset.

4.2 Initializing the Synthetic Data Generator

4.3 Training the VAE Model

Training Details:
  • Loss Function: Combines reconstruction loss (MSE) and KL divergence.
  • Optimizer: Uses the Adam optimizer for efficient training.
  • Quantum Noise: Injected during the reparameterization step.
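The loss described above combines an MSE reconstruction term with the closed-form KL divergence to a standard normal; a small sketch (the function name is illustrative):

```python
import torch
import torch.nn.functional as F

def vae_loss(recon, x, mu, logvar):
    """Reconstruction (MSE) plus KL( N(mu, sigma^2) || N(0, 1) )."""
    recon_loss = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl
```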

5. Generating Synthetic Data

Key Steps:
  • Quantum Latent Vectors: Uses quantum noise to generate latent vectors.
  • Decoding: Transforms latent vectors back to data space.
  • Post-processing: Inverse scaling and type casting to match the original data.
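These steps can be sketched as a small generation function. Here `normal_source(shape)` is an assumed helper that returns standard-normal noise derived from the QRNG, and all names are illustrative rather than the notebook's API:

```python
import numpy as np
import torch
from sklearn.preprocessing import StandardScaler

def generate_synthetic(decoder, scaler, latent_dim, n_samples, normal_source):
    """Decode quantum latent vectors and undo the feature scaling."""
    z = normal_source((n_samples, latent_dim))  # quantum latent vectors
    with torch.no_grad():
        scaled = decoder(z).numpy()             # latent -> scaled data space
    return scaler.inverse_transform(scaled)     # back to original units
```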

6. Evaluating the Synthetic Data

Usage Example:

  • Statistical Similarity: Compare mean, standard deviation, and other statistics.
  • Data Distribution: Ensure the synthetic data follows similar distributions as the real data.
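A simple per-feature comparison of means and standard deviations, as one illustrative way to run the check above (not the notebook's evaluator):

```python
import numpy as np

def compare_statistics(real, synthetic, feature_names=None):
    """Return a per-feature dict of real vs. synthetic mean and std."""
    real, synthetic = np.asarray(real), np.asarray(synthetic)
    report = {}
    for i in range(real.shape[1]):
        name = feature_names[i] if feature_names else f"feature_{i}"
        report[name] = {
            "real_mean": real[:, i].mean(), "synth_mean": synthetic[:, i].mean(),
            "real_std": real[:, i].std(), "synth_std": synthetic[:, i].std(),
        }
    return report
```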

Results

Below are the graphs for the best-performing features, replicated after training for 1 minute (300 epochs) on a Tesla T4 GPU:
This refined graph offers a clear comparison between the "mean radius" feature distributions of the real and synthetic data.

Key Observations

  • Real Data (Blue): The histogram and Kernel Density Estimation (KDE) line represent the actual distribution, showcasing its central tendency and spread.
  • Synthetic Data (Red): Both the histogram and KDE closely follow the real data, including peak alignment and tail coverage, indicating high-quality synthetic data generation.

Impact of Quantum Randomness

The QRNG integration provides true randomness in the latent space sampling, but the current implementation shows that additional architectural improvements might be needed to fully leverage this advantage. Future iterations could explore:

  • Varying the quantum noise scale during training.
  • Applying quantum randomness in other model components.
  • Implementing adaptive noise scaling based on distribution-matching metrics.

These visualizations and analyses serve as a baseline for future improvements to the QRNG-enhanced VAE architecture.

Access the Complete Tutorial


Check out the full source code on GitHub

To explore the full implementation details and run the code yourself, access the complete tutorial in our Jupyter Notebook.

7. Conclusion

In this tutorial, we've successfully integrated quantum randomness into a Variational Autoencoder for synthetic data generation. By leveraging QRNGs, we've introduced true randomness into the model, potentially enhancing the quality and security of the generated data.

Key Takeaways:

  • QRNG Integration: Provides true randomness, enhancing unpredictability.
  • VAE Architecture: Effective for generating high-dimensional synthetic data.
  • Potential Applications: Data augmentation, privacy-preserving data sharing, and more.

Next Steps:

  • Experimentation: Try different datasets and observe the effects.
  • Hyperparameter Tuning: Adjust latent dimensions, hidden sizes, and training epochs.
  • Further Research: Explore integrating QRNGs into other models like GANs.

Embrace the future of AI by integrating quantum technologies into your models today! If you found this tutorial helpful, please share it with others interested in the exciting intersection of quantum computing and artificial intelligence.

Quantum Randomness Meets AI: Building a Quantum-Enhanced GAN

Unlock the power of Quantum Random Number Generators (QRNGs) to enhance your Generative Adversarial Networks (GANs) and take synthetic data generation to the next level.
Introduction

As artificial intelligence (AI) continues to evolve, the integration of quantum computing concepts offers exciting possibilities. One such integration is the use of Quantum Random Number Generators (QRNGs) in AI models. QRNGs leverage the inherent unpredictability of quantum mechanics to produce truly random numbers, unlike classical pseudo-random number generators (PRNGs), which are deterministic and potentially predictable.

In this tutorial, we'll explore how to incorporate QRNGs into a Generative Adversarial Network (GAN) to generate synthetic images resembling the MNIST handwritten digits. We'll cover:
  • Setting up the QRNG with a caching mechanism.
  • Building a Quantum Noise Generator using QRNG.
  • Implementing a GAN that uses quantum noise in the generation process.
  • Training the Quantum GAN and visualizing the results.
By the end, you'll have a working Quantum GAN that leverages quantum randomness to potentially enhance the diversity and unpredictability of the generated data.
Why Quantum Randomness in GANs?

Limitations of Pseudo-Randomness

  • Predictability: PRNGs are algorithmically generated and can be predicted if the initial seed is known.
  • Limited Entropy: PRNGs may not provide sufficient randomness for certain applications requiring high entropy.

Advantages of Quantum Randomness

  • True Randomness: QRNGs produce numbers based on quantum phenomena, ensuring unpredictability.
  • Enhanced Diversity: Incorporating quantum randomness may introduce more variation in generated data.
  • Security Benefits: True randomness can improve the robustness of models against certain types of attacks.

Overview

Below is the flowchart representing the approach:
[Flowchart: Quantum GAN architecture]

Prerequisites

  • Python 3.6+
  • PyTorch
  • NumPy
  • Requests library
  • Torchvision
  • An API token for Quantum eMotion's Entropy-as-a-Service (EaaS) API
DEEP DIVE
1. Setting Up the Quantum Random Number Generator
We'll create a `QuantumRandomGenerator` class to interact with the QRNG API. This class includes a caching mechanism to minimize API calls.
Important Notes:
  • API Endpoint:
  • API Token: Ensure you have a valid API token and set it appropriately.
  • Caching Mechanism: Uses an LRU cache to minimize API calls and handle rate limits.

2. Managing Quantum Random Numbers Efficiently

We'll implement a buffer system with the `QuantumRandomBuffer` class to handle large requests and minimize API calls.
Key Features:
  • Buffering: Stores quantum random numbers to reduce API requests.
  • Optimal Chunk Size: Ensures requests are made in sizes supported by the API.

3. Implementing Quantum Noise for the GAN

3.1 The Quantum Noise Generator

We'll create a `QuantumNoiseGenerator` class that uses the `QuantumRandomBuffer` to generate noise for the GAN's latent space.
Explanation:
  • Uniform to Normal: Uses the Box-Muller transform to convert uniformly distributed quantum random numbers into normally distributed noise suitable for the GAN's latent space.
  • Batch Processing: Generates noise in batches to match the input requirements of the GAN.
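The Box-Muller step can be sketched in a few lines of NumPy. The `noise_source(n)` interface (returning n uniforms in [0, 1)) and the function names are illustrative:

```python
import numpy as np

def uniform_to_normal(u1, u2):
    """Box-Muller transform: two independent uniforms in (0, 1) become
    one standard-normal sample (the second output is discarded here)."""
    u1 = np.clip(u1, 1e-12, 1.0)  # guard against log(0)
    return np.sqrt(-2.0 * np.log(u1)) * np.cos(2.0 * np.pi * u2)

def quantum_latent_batch(noise_source, batch_size, latent_dim):
    """Build a (batch_size, latent_dim) array of normal noise from a
    uniform quantum source."""
    n = batch_size * latent_dim
    z = uniform_to_normal(noise_source(n), noise_source(n))
    return z.reshape(batch_size, latent_dim)
```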

4. Building the Quantum GAN

4.1 The Generator

4.2 The Discriminator

4.3 The Quantum GAN Class

Key Components:
  • Quantum Noise in Latent Space: Uses quantum-generated noise for the generator's input.
  • GAN Training Loop: Standard GAN training steps adapted to include quantum noise.
  • Loss Functions: Uses Binary Cross-Entropy Loss for both generator and discriminator.
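These components can be sketched as a minimal fully connected GAN with one adversarial training step. Widths, learning rates, and names are illustrative; `noise` would come from the quantum noise generator:

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps latent noise to flattened 28x28 images."""
    def __init__(self, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 28 * 28), nn.Tanh(),  # pixel range [-1, 1]
        )
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores flattened images as real (1) or fake (0)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1), nn.Sigmoid(),
        )
    def forward(self, x):
        return self.net(x)

def gan_step(G, D, opt_g, opt_d, real, noise):
    """One adversarial step with BCE loss for both networks."""
    bce = nn.BCELoss()
    ones = torch.ones(real.size(0), 1)
    zeros = torch.zeros(real.size(0), 1)

    # Discriminator: push real -> 1, fake -> 0 (generator frozen via detach)
    opt_d.zero_grad()
    d_loss = bce(D(real), ones) + bce(D(G(noise).detach()), zeros)
    d_loss.backward()
    opt_d.step()

    # Generator: fool the discriminator (fake -> 1)
    opt_g.zero_grad()
    g_loss = bce(D(G(noise)), ones)
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```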

5. Training the Quantum GAN

5.1 Training Function

Training Details:
  • Dataset: Uses the MNIST dataset of handwritten digits.
  • Checkpointing: Saves generated images every 100 batches for visualization.
  • Epochs and Batch Size: Configurable parameters to control training duration.

5.2 Running the Training

Important Notes:
  • API Token: Ensure your QRNG API token is set in the environment variable `API_TOKEN`.
  • Generated Images: Check the `generated_images` directory to see the outputs of the generator during training.

Quantum GAN Results Across Epochs

Below, you can see the progression of images generated by the Quantum GAN model across four different epochs. Observe how the clarity and quality improve over time.
[Generated image grids at Epochs 1, 3, 22, and 32]

Progression Details

  • Epoch 1: Initial results are noisy and pixelated, with some round and circular shapes starting to form.
  • Epoch 3: Outputs gain more clarity, beginning to form the structure of two distinct digits.
  • Epoch 22: Clarity is significantly enhanced, with outputs showing much more refined shapes.
  • Epoch 32: Achieves the best quality overall, with well-defined results and minimal noise.

Access the Complete Tutorial


Check out the full source code on GitHub

To explore the full implementation details and run the code yourself, access the complete tutorial in our Jupyter Notebook.

6. Conclusion

In this tutorial, we've successfully integrated quantum randomness into a GAN for image generation. By using a QRNG, we've introduced true randomness into the model's latent space, potentially enhancing the diversity and unpredictability of the generated images.

Key Takeaways:

  • Quantum Randomness: Provides a source of true randomness, which may improve model robustness.
  • GAN Architecture: Remains standard, allowing easy integration of quantum noise.
  • Potential Benefits: Opens avenues for research into the effects of quantum randomness on generative models.

Next Steps:

  • Experimentation: Try training the Quantum GAN on different datasets.
  • Hyperparameter Tuning: Adjust the latent dimension, learning rates, and network architectures.
  • Further Research: Explore the effects of quantum randomness on other generative architectures.

Embrace the future of AI by integrating quantum technologies into your models today! If you found this tutorial helpful, please share it with others interested in the exciting intersection of quantum computing and artificial intelligence.