Create a deep learning time series forecasting model using TensorFlow and LSTM networks. Learn to build windowed datasets, train neural networks with the Keras API, make future predictions, and deploy models - essential skills for any time series prediction task.
This document explains the TensorFlow example provided in tensorflow_example.py.
The example demonstrates how to:

- Load and preprocess time series data with pandas
- Build a windowed dataset suitable for sequence prediction
- Define and train an LSTM model with the Keras API
- Forecast future values beyond the end of the series
- Save and reload the trained model
- Visualize the actual values, predictions, and forecast
```python
import os

import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
from sklearn.preprocessing import MinMaxScaler
import matplotlib.pyplot as plt

output_dir = os.path.dirname(os.path.abspath(__file__))

def load_data(filename):
    data_path = os.path.join(output_dir, filename)
    df = pd.read_csv(data_path)
    df['Date'] = pd.to_datetime(df['Date'])
    df.set_index('Date', inplace=True)
    return df

df = load_data('energy_consumption.csv')
```
This section imports necessary modules and loads the energy consumption data from a CSV file. The file path is constructed using the script's directory to ensure it works regardless of the current working directory.
Use the script generate_energy_consumption_data.py to generate the test data we'll be working with:
python generate_energy_consumption_data.py
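The generator script itself is not reproduced here. If you just need a compatible input file, a minimal stand-in along the following lines would work; the Date and Consumption column names are what load_data expects, while the trend, seasonality, and noise are purely illustrative assumptions:

```python
# Hypothetical stand-in for generate_energy_consumption_data.py (the real
# script is not shown in this document). It writes a Date/Consumption CSV
# that load_data() can read; the signal shape is an assumption.
import numpy as np
import pandas as pd

dates = pd.date_range(start='2020-01-01', periods=1000, freq='D')
trend = np.linspace(100, 150, len(dates))                   # slow upward drift
seasonal = 20 * np.sin(2 * np.pi * dates.dayofyear / 365)   # yearly cycle
noise = np.random.normal(0, 5, len(dates))
consumption = trend + seasonal + noise

pd.DataFrame({'Date': dates, 'Consumption': consumption}).to_csv(
    'energy_consumption.csv', index=False)
```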
```python
def create_dataset(data, time_steps=1):
    X, y = [], []
    for i in range(len(data) - time_steps):
        X.append(data[i:(i + time_steps), 0])
        y.append(data[i + time_steps, 0])
    return np.array(X), np.array(y)

data = df['Consumption'].values.reshape(-1, 1)
scaler = MinMaxScaler(feature_range=(0, 1))
data_scaled = scaler.fit_transform(data)

time_steps = 60
X, y = create_dataset(data_scaled, time_steps)
X = np.reshape(X, (X.shape[0], X.shape[1], 1))
```
Here, we preprocess the data by:

- Reshaping the Consumption column into a single-feature array
- Scaling the values to the range [0, 1] with MinMaxScaler
- Sliding a 60-step window over the series, so each sample holds 60 past observations and the value that follows them as its target
- Reshaping the windows into the (samples, time_steps, features) shape that LSTM layers expect

A small example of the windowing step is shown below.
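To make the windowing concrete, here is a small self-contained example that applies the same create_dataset logic to a toy series (the values are made up purely for illustration):

```python
import numpy as np

def create_dataset(data, time_steps=1):
    X, y = [], []
    for i in range(len(data) - time_steps):
        X.append(data[i:(i + time_steps), 0])
        y.append(data[i + time_steps, 0])
    return np.array(X), np.array(y)

# Toy series of 10 values, window of 3 past steps per sample
toy = np.arange(10, dtype=float).reshape(-1, 1)
X_toy, y_toy = create_dataset(toy, time_steps=3)

print(X_toy.shape)          # (7, 3) -> 7 windows of 3 consecutive values
print(y_toy.shape)          # (7,)   -> the value immediately after each window
print(X_toy[0], y_toy[0])   # [0. 1. 2.] 3.0
```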
```python
# Chronological train/test split (an 80/20 split is assumed here; the original
# script defines X_train, y_train, X_test, y_test before training)
train_size = int(len(X) * 0.8)
X_train, X_test = X[:train_size], X[train_size:]
y_train, y_test = y[:train_size], y[train_size:]

model = Sequential([
    LSTM(50, activation='relu', input_shape=(time_steps, 1), return_sequences=True),
    LSTM(50, activation='relu'),
    Dense(1)
])
model.compile(optimizer='adam', loss='mse')

history = model.fit(X_train, y_train, epochs=50, batch_size=32,
                    validation_split=0.2, verbose=1)
```
After splitting the windows chronologically into training and test sets, this code builds a stacked two-layer LSTM model using TensorFlow's Keras API and trains it on the training portion, holding out 20% of it for validation.
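The example trains for a fixed 50 epochs. As an optional variation that is not part of the original script, you could replace the plain model.fit call above with one that uses an EarlyStopping callback, and then plot the loss curves recorded in history (this reuses model, X_train, y_train, plt, os, and output_dir defined above):

```python
# Optional variation, not in the original script: stop training once the
# validation loss stops improving, then plot the recorded loss curves.
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=5,
                           restore_best_weights=True)
history = model.fit(X_train, y_train, epochs=50, batch_size=32,
                    validation_split=0.2, callbacks=[early_stop], verbose=1)

plt.plot(history.history['loss'], label='train loss')
plt.plot(history.history['val_loss'], label='validation loss')
plt.legend()
plt.savefig(os.path.join(output_dir, 'training_loss.png'))
plt.close()
```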
```python
train_predict = model.predict(X_train)
test_predict = model.predict(X_test)

# Invert predictions back to the original scale
train_predict = scaler.inverse_transform(train_predict)
y_train = scaler.inverse_transform(y_train.reshape(-1, 1))
test_predict = scaler.inverse_transform(test_predict)
y_test = scaler.inverse_transform(y_test.reshape(-1, 1))

# Iteratively forecast the next 30 days, feeding each prediction back
# into the 60-day input window
last_60_days = data_scaled[-60:]
future_days = 30
future_pred = []
for _ in range(future_days):
    next_pred = model.predict(last_60_days.reshape(1, 60, 1))
    future_pred.append(next_pred[0, 0])
    last_60_days = np.roll(last_60_days, -1)
    last_60_days[-1] = next_pred[0, 0]

future_pred = scaler.inverse_transform(np.array(future_pred).reshape(-1, 1))
```
This section uses the trained model to make predictions on the training and test data, and then forecasts energy consumption for the next 30 days by predicting one step at a time and feeding each prediction back into the 60-day input window.
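The original script does not print numeric error metrics. If you want to quantify the fit on the held-out data, one small optional addition is to compute RMSE and MAE on the inverse-transformed test predictions (reusing y_test and test_predict from above):

```python
# Optional addition, not in the original script: report test-set error
# in the original units of the Consumption column.
from sklearn.metrics import mean_absolute_error, mean_squared_error

rmse = np.sqrt(mean_squared_error(y_test, test_predict))
mae = mean_absolute_error(y_test, test_predict)
print(f"Test RMSE: {rmse:.2f}")
print(f"Test MAE:  {mae:.2f}")
```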
```python
model_path = os.path.join(output_dir, 'energy_forecast_model.keras')
model.save(model_path)

loaded_model = tf.keras.models.load_model(model_path)
```
Here, we save the trained model and then load it back, simulating a deployment scenario.
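As an optional sanity check (not part of the original script), you can confirm that the reloaded model behaves like the in-memory one by running a one-step prediction on the most recent window with both; this reuses model, loaded_model, data_scaled, time_steps, and scaler from above:

```python
# Optional sanity check: the reloaded model should produce (numerically)
# the same one-step prediction as the model still in memory.
last_window = data_scaled[-time_steps:].reshape(1, time_steps, 1)

original_next = model.predict(last_window)
reloaded_next = loaded_model.predict(last_window)

print(scaler.inverse_transform(original_next))
print(scaler.inverse_transform(reloaded_next))
assert np.allclose(original_next, reloaded_next, atol=1e-5)
```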
```python
# Visualize the results
plt.figure(figsize=(15, 6))

# Dates corresponding to the test targets (the test set is the tail of the
# series, since the split is chronological)
plot_index = df.index[-len(y_test):]
plt.plot(plot_index, y_test, label='Actual')
plt.plot(plot_index, test_predict, label='Predicted')

# Forecast future values
last_date = df.index[-1]
future_dates = pd.date_range(start=last_date + pd.Timedelta(days=1), periods=future_days)
plt.plot(future_dates, future_pred, label='Forecast')

plt.title('Energy Consumption Forecast')
plt.xlabel('Date')
plt.ylabel('Energy Consumption')
plt.legend()
plt.savefig(os.path.join(output_dir, 'energy_forecast.png'))
plt.close()

print("Forecasting and visualization completed. Check the output directory for results.")
```
Finally, we visualize the actual values, predictions, and future forecast using matplotlib.
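The script saves only a PNG. If downstream consumers also need the numbers, one small optional addition is to write the 30-day forecast to a CSV next to the plot; the energy_forecast.csv filename here is just a suggestion:

```python
# Optional addition, not in the original script: persist the 30-day forecast
# as a CSV alongside the plot for downstream use.
forecast_df = pd.DataFrame({'Date': future_dates, 'Forecast': future_pred.ravel()})
forecast_df.to_csv(os.path.join(output_dir, 'energy_forecast.csv'), index=False)
```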
To run this example:
Ensure you have the required libraries installed:
pip install tensorflow numpy pandas scikit-learn matplotlib
Generate the sample data by running:
python generate_energy_consumption_data.py
Run the TensorFlow example:
python tensorflow_example.py
The script will load the data, train the model, make predictions, forecast future values, and save a visualization of the results in the output directory.
To recap, here is what each step does and why it matters:
We use pandas to load and preprocess time series data.
The create_dataset function creates a windowed dataset suitable for sequence prediction.
Significance: Proper data preparation is crucial for time series forecasting, ensuring the model can learn from historical patterns.
We use MinMaxScaler to normalize the data between 0 and 1 (a short round-trip sketch follows this list).
Significance: Normalization helps the neural network converge faster and improves performance, especially for time series data with varying scales.
We use TensorFlow's Keras API to build an LSTM model, which is well-suited for sequence prediction tasks.
Significance: LSTM networks can capture long-term dependencies in time series data, making them ideal for complex patterns in energy consumption.
We split the data into training and testing sets, train the model, and evaluate its performance.
Significance: This process helps us understand how well the model generalizes to unseen data, which is crucial for real-world applications.
We use the trained model to forecast energy consumption for the next 30 days.
Significance: This demonstrates how the model can be used for practical business planning and decision-making in energy management.
We save the trained model and load it back, simulating a deployment scenario.
Significance: This feature is crucial in real-world data engineering pipelines, allowing models to be trained once and deployed multiple times.
We use matplotlib to visualize the actual vs. predicted values and the future forecast.
Significance: Visualization is key for communicating results to stakeholders and identifying patterns or anomalies in the forecast.
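As referenced above, here is a minimal sketch with made-up values showing why the inverse_transform calls in the prediction step are needed: the scaler remembers the original data range and maps scaled values back into it.

```python
# Minimal MinMaxScaler round trip with illustrative values: fit_transform
# maps the data into [0, 1], and inverse_transform restores the original units.
import numpy as np
from sklearn.preprocessing import MinMaxScaler

values = np.array([[120.0], [135.0], [150.0]])
scaler = MinMaxScaler(feature_range=(0, 1))

scaled = scaler.fit_transform(values)        # [[0. ], [0.5], [1. ]]
restored = scaler.inverse_transform(scaled)  # back to 120, 135, 150

print(scaled.ravel(), restored.ravel())
```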