
Save to and Load From Files, Auto-Save, Preventing Data Loss

 


Save to and Load From Files

There are some things that primary storage (RAM) does better than secondary storage.
For example, it's easy to access, amend, or remove a piece of data held in a list in RAM.
Holding data in a file in secondary storage makes this more difficult. Or does it?
With Python, there's more than meets the eye.


👉 The program below lets me add & remove events and dates in a diary system. It adds the name & date of an event to a 2D list, or searches for an existing event by name and removes it.

myEvents = []
def prettyPrint():
  print()
  for row in myEvents:
    print(f"{row[0] :^15} {row[1] :^15}")
  print()
while True:
  menu = input("1: Add, 2: Remove\n")
  if menu == "1":
    event = input("What event?: ").title() # .title() so the name matches the remove search below
    date = input("What date?: ")
    row = [event, date]
    myEvents.append(row)
    prettyPrint()
  else:
    criteria = input("What event do you want to remove?: ").title()
    for row in myEvents[:]: # loop over a copy so removing rows doesn't skip items
      if criteria in row:
        myEvents.remove(row)
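
Removing items from a list while looping over that same list is a classic Python pitfall: after each removal the iterator skips the next element. Looping over a copy avoids it. A minimal sketch (the event names here are invented for illustration):

```python
# Removing while iterating over the same list skips entries
events = [["Party", "01/01"], ["Party", "02/02"], ["Gig", "03/03"]]
for row in events:
  if "Party" in row:
    events.remove(row)
print(events)  # [['Party', '02/02'], ['Gig', '03/03']] - one "Party" survives!

# events[:] makes a shallow copy, so removals don't disturb the loop
events = [["Party", "01/01"], ["Party", "02/02"], ["Gig", "03/03"]]
for row in events[:]:
  if "Party" in row:
    events.remove(row)
print(events)  # [['Gig', '03/03']] - both "Party" rows removed
```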

Manually saving this program's data to a file and loading it back each time would be a massive faff. Instead, we can set up an auto-save by writing the save code at the end of our infinite loop.

Auto-Save

👉 At the bottom of the code, we are going to add an auto-save just before the loop repeats.
Make sure this new code matches the indentation of the while loop's body, so it runs as part of the loop.
myEvents = []
def prettyPrint():
  print()
  for row in myEvents:
    print(f"{row[0] :^15} {row[1] :^15}")
  print()
while True:
  menu = input("1: Add, 2: Remove\n")
  if menu == "1":
    event = input("What event?: ").title() # .title() so the name matches the remove search below
    date = input("What date?: ")
    row = [event, date]
    myEvents.append(row)
    prettyPrint()
  else:
    criteria = input("What event do you want to remove?: ").title()
    for row in myEvents[:]: # loop over a copy so removing rows doesn't skip items
      if criteria in row:
        myEvents.remove(row)
  ########### THIS IS THE NEW BIT ########
  f = open("calendar.txt", "w") # 'w' mode replaces the file with the whole 2D list every time
  f.write(str(myEvents)) # Need to cast the list to a single string
  f.close()
  #########################################
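
To see exactly what ends up in calendar.txt, here's the save step on its own, with some made-up example events. `str()` turns the whole 2D list into one line of text that looks just like Python source:

```python
# The auto-save in isolation: write str() of the whole 2D list to the file
myEvents = [["Party", "01/01"], ["Dentist", "14/02"]]
with open("calendar.txt", "w") as f:
  f.write(str(myEvents))

# Read it back to see the file's contents
with open("calendar.txt") as f:
  saved = f.read()
print(saved)  # [['Party', '01/01'], ['Dentist', '14/02']]
```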


Preventing Data Loss 

Did you find the problem?

Yep. Every time we run the program, it creates a new, blank myEvents[] list which gets written to the file.

This overwrites any events in the file that we saved when we ran the program previously.

To solve this, we set up the program to load any pre-existing data from the file into the myEvents list at the very start of the code.

Pay close attention to the eval() function. It's the special sauce here.

👉 eval() takes the text from the file, runs it as Python code, and the result is assigned to myEvents as a 2D list. Good, eh?

myEvents = []
####### THIS IS THE NEW BIT ################
try:
  f = open("calendar.txt", "r") # Only need read permissions here
  myEvents = eval(f.read())
  f.close()
except FileNotFoundError:
  pass # No saved file yet (first run) - start with an empty diary
########################################
def prettyPrint():
  print()
  for row in myEvents:
    print(f"{row[0] :^15} {row[1] :^15}")
  print()
while True:
  menu = input("1: Add, 2: Remove\n")
  if menu == "1":
    event = input("What event?: ").title() # .title() so the name matches the remove search below
    date = input("What date?: ")
    row = [event, date]
    myEvents.append(row)
    prettyPrint()
  else:
    criteria = input("What event do you want to remove?: ").title()
    for row in myEvents[:]: # loop over a copy so removing rows doesn't skip items
      if criteria in row:
        myEvents.remove(row)

  f = open("calendar.txt", "w")
  f.write(str(myEvents))
  f.close()
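
Here's eval() doing the load step on its own, with sample text invented for illustration. One caution worth knowing: eval() will run *any* Python code it's given, so if someone else could edit the file, the standard library's `ast.literal_eval` is a safer swap - it parses plain literals like lists and strings but refuses to run arbitrary code:

```python
import ast

# Sample file contents (same format our auto-save writes)
text = "[['Party', '01/01'], ['Dentist', '14/02']]"

# eval() runs the text as Python code and hands back the 2D list
myEvents = eval(text)
print(myEvents[0])  # ['Party', '01/01']

# ast.literal_eval gives the same result for plain literals,
# but raises an error instead of executing anything dangerous
myEvents = ast.literal_eval(text)
print(myEvents[1])  # ['Dentist', '14/02']
```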



