Ben Chuanlong Du's Blog

It is never too late to learn.

Python Logging Made Stupidly Simple With Loguru

The best logging package for Python!

  1. Note that the default logging level in loguru is DEBUG, and loguru does not allow changing the logging level of an existing handler. You can refer to changing-the-level-of-an-existing-handler and Change level of default handler for ways to change the logging level in loguru.

    1. Remove the default handler (whose logging level is DEBUG) and add a new one with the desired logging level.

Tips on Fbs

Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!

https://build-system.fman.io/

https://github.com/mherrmann/fbs-tutorial

New Features in Spark 3

AQE (Adaptive Query Execution)

To enable AQE, you have to set spark.sql.adaptive.enabled to true (using --conf spark.sql.adaptive.enabled=true in spark-submit, or using spark.conf.set("spark.sql.adaptive.enabled", "true") in Spark/PySpark code).
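For example, a spark-submit invocation enabling AQE might look like the following (the script name is a placeholder):

```shell
spark-submit \
    --conf spark.sql.adaptive.enabled=true \
    my_job.py
```

Equivalently, in PySpark code you can call spark.conf.set("spark.sql.adaptive.enabled", "true") on an existing SparkSession, or pass .config("spark.sql.adaptive.enabled", "true") to SparkSession.builder before the session is created.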

Pandas UDFs

Pandas UDFs are user-defined functions executed by Spark that use Apache Arrow to transfer data and pandas to operate on it, which enables vectorized operations. A Pandas UDF is defined using the pandas_udf decorator.
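A minimal sketch of the idea, assuming PySpark 3.x. The vectorized logic itself is plain pandas (a Celsius-to-Fahrenheit conversion, chosen only for illustration); the commented lines show how it would be wrapped as a Pandas UDF and applied to a DataFrame column.

```python
import pandas as pd

# Vectorized logic: operates on an entire pandas Series at once,
# rather than row by row as a plain Python UDF would.
def to_fahrenheit(c: pd.Series) -> pd.Series:
    return c * 9 / 5 + 32

# In Spark, the same function becomes a Pandas UDF (requires pyspark):
#   from pyspark.sql.functions import pandas_udf
#   fahrenheit_udf = pandas_udf(to_fahrenheit, returnType="double")
#   df = df.withColumn("temp_f", fahrenheit_udf("temp_c"))
```

Spark splits the column into Arrow batches, hands each batch to to_fahrenheit as a pandas Series, and reassembles the results into a new column.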