Keras and TensorFlow used to be separate libraries, but Keras is now part of TensorFlow
In recent versions of TensorFlow, it is difficult to create and use neural networks without also using Keras
However, TensorFlow still offers a few features outside the Keras API that are useful on their own; several are covered at the end of this section
Keras Neural Network With New Data Set
MNIST Data Set
Input: images of handwritten digits (28x28 pixels)
Output: digit shown in the image
We will go into more detail on image manipulation and classification later in the course, but we can do a preliminary analysis of the MNIST data set with ordinary fully connected networks.
```python
import matplotlib.pyplot as plt
from tensorflow.keras.datasets import mnist
## NOTE: The same data set is available from Scikit-Learn

# Load the training and test splits
(train_images, train_labels), (test_images, test_labels) = mnist.load_data()
```
```python
# Display a few sample images
num_samples = 5  # Change this to display a different number of images

plt.figure(figsize=(12, 4))
for i in range(num_samples):
    plt.subplot(1, num_samples, i + 1)
    plt.imshow(train_images[i], cmap='gray')
    plt.title(f"Label: {train_labels[i]}")
plt.show()
```
```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.utils import to_categorical
```
```python
# Normalize pixel values to be between 0 and 1
train_images, test_images = train_images / 255.0, test_images / 255.0
```
```python
# One-hot encode the labels
train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)
```
```python
np.shape(train_images[i])
```

(28, 28)
```python
def mnist_classification_model():
    # Define the model
    model = Sequential()

    # Add layers to the model
    model.add(Flatten(input_shape=(28, 28)))    # Flatten the 28x28 input images to a 1D vector
    model.add(Dense(128, activation='relu'))    # Fully connected layer with ReLU activation
    model.add(Dense(64, activation='relu'))     # Another fully connected layer with ReLU activation
    model.add(Dense(10, activation='softmax'))  # Output layer with 10 neurons for 10 classes (digits 0-9)

    # Compile the model
    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    return model
```
```python
mnist_model = mnist_classification_model()
```
```python
history = mnist_model.fit(train_images, train_labels,
                          epochs=10, batch_size=32,
                          validation_split=0.2, verbose=1)
```
```python
# Evaluate the model on the test data
test_loss, test_accuracy = mnist_model.evaluate(test_images, test_labels)
print(f"Test accuracy: {test_accuracy * 100:.2f}%")
```
```python
# Summarize history for accuracy
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'validation'], loc='upper left')
plt.show()
```
Custom Loss Functions
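In Keras, a custom loss is any function that takes `(y_true, y_pred)` and returns the per-sample losses; it is passed to `compile` in place of a built-in loss name. As a minimal sketch (the name `custom_mse` is purely illustrative), mean squared error written by hand looks like this:

```python
import tensorflow as tf

# Any (y_true, y_pred) -> per-sample-loss function works as a Keras loss
def custom_mse(y_true, y_pred):
    return tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)

# Used exactly like a built-in loss:
# model.compile(optimizer='adam', loss=custom_mse)
```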
TensorFlow Variables
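A `tf.Variable` holds mutable state that TensorFlow tracks for automatic differentiation; model weights are stored this way internally. A minimal sketch:

```python
import tensorflow as tf

w = tf.Variable(3.0)            # mutable, gradient-tracked state

with tf.GradientTape() as tape:
    y = w ** 2                  # y = w^2

print(tape.gradient(y, w))      # dy/dw = 2w = 6.0
```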
GPU Support
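TensorFlow automatically places most operations on a GPU when one is available. A quick check of what hardware it can see (a sketch; no GPU is assumed here):

```python
import tensorflow as tf

# An empty list means TensorFlow is running CPU-only
print(tf.config.list_physical_devices('GPU'))

# Operations can also be pinned to a specific device when one exists:
# with tf.device('/GPU:0'):
#     x = tf.random.normal((1000, 1000))
```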
Preprocessing Layers
Flatten Layer
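We used `Flatten` in the MNIST model above; as a small sketch of what it does to a batch of images (the zeros array is dummy data):

```python
import numpy as np
from tensorflow.keras.layers import Flatten

batch = np.zeros((32, 28, 28))   # 32 images, 28x28 each
print(Flatten()(batch).shape)    # (32, 784): batch axis kept, rest collapsed
```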
Normalization Layer
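The `Normalization` layer learns each feature's mean and variance from data via `adapt()`, then standardizes its inputs. A minimal sketch on toy data:

```python
import numpy as np
from tensorflow.keras.layers import Normalization

layer = Normalization()
data = np.array([[1.0], [2.0], [3.0]])
layer.adapt(data)       # learn mean and variance from the data
print(layer(data))      # approximately [[-1.22], [0.], [1.22]]
```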
CategoryEncoding Layer
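`CategoryEncoding` maps integer category indices to one-hot (or multi-hot) vectors, much like `to_categorical` above but as a layer inside a model. A sketch:

```python
from tensorflow.keras.layers import CategoryEncoding

layer = CategoryEncoding(num_tokens=4, output_mode='one_hot')
print(layer([0, 1, 3]))
# [[1. 0. 0. 0.]
#  [0. 1. 0. 0.]
#  [0. 0. 0. 1.]]
```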
Image Preprocessing Layers
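Keras also provides image-specific preprocessing and augmentation layers that can sit at the front of a model, so the preprocessing travels with it. A sketch of a few common ones:

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Resizing, Rescaling, RandomFlip

preprocess = Sequential([
    Resizing(28, 28),          # resize inputs to 28x28
    Rescaling(1.0 / 255),      # map pixel values to [0, 1]
    RandomFlip('horizontal'),  # augmentation; active only during training
])
```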
Saving Trained Models
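A minimal sketch of saving and reloading the MNIST model trained above (the filename is arbitrary; depending on the TensorFlow version, an `.h5` file or a SavedModel directory can be used instead):

```python
from tensorflow.keras.models import load_model

mnist_model.save('mnist_model.keras')       # single-file Keras format
restored = load_model('mnist_model.keras')  # same architecture and weights
```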
Other TensorFlow Customizations
TensorFlow offers many other customizations besides the ones discussed here (a custom activation function is sketched below):

* Custom Activation Functions
* Custom Initialization Schemes
* Custom Layers
* Custom Training Loops
These are unlikely to be used in this course, but are good to know about for the future
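As one example from the list above, a custom activation function is just a tensor-in, tensor-out function (a sketch; `swish_like` is an illustrative name):

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense

def swish_like(x):
    return x * tf.sigmoid(x)   # any tensor -> tensor function works

# Passed anywhere a built-in activation name is accepted
layer = Dense(64, activation=swish_like)
```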