To speed up a slow model I added caching fields to my Django model; they are recomputed on save().
New records get the cached fields automatically, but I needed to backfill the old ones too.
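The pattern, sketched in plain Python for brevity (the class and field names here are made up, not from my actual model): the cached field is recomputed inside save() so it always stays in sync.

```python
class Record:
    """Plain-Python stand-in for a Django model with a cached field."""

    def __init__(self, data):
        self.data = data
        self.cached_total = None  # cached field, filled in on save()

    def _expensive_total(self):
        # placeholder for the slow computation being cached
        return sum(self.data)

    def save(self):
        # recompute the cache every time the record is saved,
        # mirroring an overridden Django Model.save()
        self.cached_total = self._expensive_total()
        # a real Django model would call super().save(...) here


r = Record([1, 2, 3])
r.save()
print(r.cached_total)  # -> 6
```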
The table is pretty big, so I wanted a progress bar (as always, tqdm).
The second problem is that Django's Model.save() returns None, and materializing the map over a lot of elements would build a pretty big list of nothing but None values.
The Python built-in collections library to the rescue:
deque with maxlen=0 consumes the whole iterator without storing any of the elements.
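A quick illustration of why deque(iterator, maxlen=0) works here: it drains the iterator to exhaustion while keeping at most zero elements, so nothing accumulates in memory. A minimal sketch with a plain generator instead of a queryset:

```python
import collections

def gen():
    # stand-in for a large lazy iterator
    for i in range(1_000_000):
        yield i

d = collections.deque(gen(), maxlen=0)
# the generator has been fully consumed, but the deque stayed empty
print(len(d))  # -> 0
```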
The code I ran in my shell_plus:
import collections
from tqdm import tqdm

iterator = map(lambda x: x.save(), MyModel.objects.all())
with tqdm(iterator, total=MyModel.objects.count(), ascii=True) as pbar:
    collections.deque(pbar, maxlen=0)
For really big tables even this takes a while: the map iterator itself is cheap, but the underlying queryset is evaluated (every row fetched and turned into a model instance) as soon as iteration starts, and the count() call is one extra query on top.
To save memory on really large tables, you can pass the queryset through QuerySet.iterator(), which streams results from the database instead of caching the whole result set.
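One detail worth seeing in isolation: creating the map object does no work at all; the "save" calls only happen once something iterates it. A minimal sketch (no Django, fake_save is a made-up stand-in for Model.save):

```python
import collections

calls = []

def fake_save(x):
    # record that "save" ran for this element
    calls.append(x)

iterator = map(fake_save, range(5))
# nothing has run yet: map() is lazy
print(len(calls))  # -> 0

collections.deque(iterator, maxlen=0)  # draining triggers the calls
print(calls)  # -> [0, 1, 2, 3, 4]
```

This is why the progress bar starts immediately but the first tick can lag: the delay is the database handing over rows, not the iterator construction.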