# Treesize 2.5

11/14/2022

The following script prints the directory size of all sub-directories of the specified directory. If the argument is omitted, the script works in the current directory. The output is sorted by directory size, from biggest to smallest. It also tries to benefit (where possible) from caching the calls of the recursive function. P.S. I've used recipe 578019 for showing the directory size in a human-friendly format.

```python
from __future__ import print_function
import os
import sys
import functools

my_cache_decorator = functools.lru_cache(maxsize=4096)

@my_cache_decorator
def get_dir_size(start_path='.'):
    total_size = 0
    if hasattr(os, 'scandir'):
        # using fast 'os.scandir' method (new in version 3.5)
        for entry in os.scandir(start_path):
            if entry.is_dir(follow_symlinks=False):
                total_size += get_dir_size(entry.path)
            elif entry.is_file(follow_symlinks=False):
                total_size += entry.stat().st_size
    else:
        # using slow, but compatible 'os.listdir' method
        for entry in os.listdir(start_path):
            full_path = os.path.abspath(os.path.join(start_path, entry))
            if os.path.isdir(full_path):
                total_size += get_dir_size(full_path)
            elif os.path.isfile(full_path):
                total_size += os.path.getsize(full_path)
    return total_size

def bytes2human(n, format='%(value).0f%(symbol)s', symbols='customary'):
    """
    Convert n bytes into a human readable string based on format.
    Symbols can be either "customary", "customary_ext", "iec" or "iec_ext".
    """
    SYMBOLS = {
        'customary': ('B', 'K', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y'),
        'customary_ext': ('byte', 'kilo', 'mega', 'giga', 'tera', 'peta',
                          'exa', 'zetta', 'iotta'),
        'iec': ('Bi', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi', 'Yi'),
        'iec_ext': ('byte', 'kibi', 'mebi', 'gibi', 'tebi', 'pebi',
                    'exbi', 'zebi', 'yobi'),
    }
    n = int(n)
    if n < 0:
        raise ValueError("n < 0")
    symbols = SYMBOLS[symbols]
    prefix = {}
    for i, s in enumerate(symbols[1:]):
        prefix[s] = 1 << (i + 1) * 10
    for symbol in reversed(symbols[1:]):
        if n >= prefix[symbol]:
            value = float(n) / prefix[symbol]
            return format % locals()
    return format % dict(symbol=symbols[0], value=n)

if __name__ == '__main__':
    start_dir = os.path.normpath(os.path.abspath(sys.argv[1])) if len(sys.argv) > 1 else '.'
    dir_sizes = {d.path: get_dir_size(d.path)
                 for d in os.scandir(start_dir) if d.is_dir(follow_symlinks=False)}
    for path, size in sorted(dir_sizes.items(), key=lambda kv: kv[1], reverse=True):
        print(bytes2human(size), path)
```

Example of the human-friendly formatting:

> bytes2human(9856, symbols="customary_ext")

Another approach returns both the apparent size (the number of bytes in the files) and the actual disk space the files use. It relies on `st.st_blocks` for the disk space used, and thus works only on Unix-like systems:

```python
import os

def get_size(path):
    apparent_total_bytes = 0
    total_bytes = 0
    have = []
    for dirpath, dirnames, filenames in os.walk(path):
        apparent_total_bytes += os.lstat(dirpath).st_size
        total_bytes += os.lstat(dirpath).st_blocks * 512
        for f in filenames:
            fp = os.path.join(dirpath, f)
            if os.path.islink(fp):
                continue
            st = os.lstat(fp)
            if st.st_ino in have:
                continue  # skip hardlinks which were already counted
            have.append(st.st_ino)
            apparent_total_bytes += st.st_size
            total_bytes += st.st_blocks * 512
        for d in dirnames:
            dp = os.path.join(dirpath, d)
            if os.path.islink(dp):
                apparent_total_bytes += os.lstat(dp).st_size
    return (apparent_total_bytes, total_bytes)
```

The code for the human-readable output:

```python
def humanized_size(num, suffix='B', si=False):
    if si:
        units = ['', 'K', 'M', 'G', 'T', 'P', 'E', 'Z']
        last_unit = 'Y'
        div = 1000.0
    else:
        units = ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi']
        last_unit = 'Yi'
        div = 1024.0
    for unit in units:
        if abs(num) < div:
            return '%3.1f%s%s' % (num, unit, suffix)
        num /= div
    return '%.1f%s%s' % (num, last_unit, suffix)
```

I also came across this question, which has some more compact and probably more performant strategies for printing file sizes.
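To make the apparent-size vs. disk-usage distinction concrete, here is a minimal self-contained sketch (my own illustration, not part of the original answer; it is Unix-only, since `st_blocks` is not meaningful on Windows):

```python
import os
import tempfile

# Write a tiny 10-byte file, then compare its apparent size (st_size)
# with the disk space actually allocated for it (st_blocks * 512).
fd, name = tempfile.mkstemp()
os.write(fd, b'x' * 10)
os.close(fd)

st = os.lstat(name)
print('apparent size:', st.st_size)        # 10
print('disk usage:', st.st_blocks * 512)   # usually one whole filesystem block, e.g. 4096
os.unlink(name)
```

The gap between the two numbers is exactly why `du` and `ls -l` disagree: allocation happens in whole blocks, and sparse files can even make the apparent size larger than the disk usage.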
This walks all sub-directories, summing file sizes:

```python
import os

def get_size(start_path='.'):
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            if not os.path.islink(fp):  # skip symbolic links
                total_size += os.path.getsize(fp)
    return total_size

print(get_size(), 'bytes')
```

And a one-liner for fun using os.listdir (does not include sub-directories):

```python
import os
sum(os.path.getsize(f) for f in os.listdir('.') if os.path.isfile(f))
```

* os.path.getsize - gives the size in bytes. Using os.path.getsize is clearer than the os.stat().st_size method. Thanks to ghostdog74 for pointing this out!
* os.stat - st_size gives the size in bytes. It can also be used to get file size and other file-related information.
* os.scandir (Python 3.5+):

```python
nbytes = sum(d.stat().st_size for d in os.scandir('.') if d.is_file())
```

If you use Python 3.4 or earlier, you may consider using the more efficient walk method provided by the third-party scandir package. In Python 3.5 and later, this package has been incorporated into the standard library, and os.walk has received the corresponding increase in performance.

Recently I've been using pathlib more and more; here's a pathlib solution:

```python
from pathlib import Path

root_directory = Path('.')
sum(f.stat().st_size for f in root_directory.glob('**/*') if f.is_file())
```

Using pathlib, I came up with this one-liner to get the size of a folder:

```python
sum(file.stat().st_size for file in Path(folder).rglob('*'))
```

And this is what I came up with for a nicely formatted output:

```python
from pathlib import Path

class ByteSize(int):

    _KB = 1024
    _suffixes = 'B', 'KB', 'MB', 'GB', 'TB', 'PB'

    def __new__(cls, *args, **kwargs):
        return super().__new__(cls, *args, **kwargs)

    def __init__(self, *args, **kwargs):
        self.bytes = self.B = int(self)
        self.kilobytes = self.KB = self / self._KB**1
        self.megabytes = self.MB = self / self._KB**2
        self.gigabytes = self.GB = self / self._KB**3
        self.terabytes = self.TB = self / self._KB**4
        self.petabytes = self.PB = self / self._KB**5
        *suffixes, last = self._suffixes
        suffix = next((s for s in suffixes
                       if 1 < getattr(self, s) < self._KB), last)
        self.readable = suffix, getattr(self, suffix)

def get_folder_size(folder):
    return ByteSize(sum(file.stat().st_size for file in Path(folder).rglob('*')))
```

Usage:

> size = get_folder_size("c:/users/tdavis/downloads")
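To sanity-check the pathlib approach, here is a small self-contained sketch; the throwaway directory tree and file names are made up for illustration:

```python
import tempfile
from pathlib import Path

# Build a tiny tree (one file at the root, one in a sub-directory), then
# sum the file sizes with the pathlib one-liner; directories matched by
# rglob('*') are filtered out by is_file().
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / 'sub').mkdir()
    (root / 'a.bin').write_bytes(b'x' * 100)
    (root / 'sub' / 'b.bin').write_bytes(b'y' * 50)
    total = sum(f.stat().st_size for f in root.rglob('*') if f.is_file())
    print(total)  # 150
```

Note that, like the os.walk versions above, this counts apparent file sizes, not allocated disk blocks.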