Detecting the semantic types of columns in data lake tables is an important task. A key bottleneck in semantic type detection is the scarcity of human annotations, owing to the inherent complexity of data lakes. In this paper, we propose using programmatic weak supervision, which relies on labeling functions, to assist in annotating training data for semantic type detection. A challenge in this process is that labeling functions are difficult to write manually because of the large volume and low quality of data lake table datasets. To address this issue, we explore employing Large Language Models (LLMs) to generate labeling functions and introduce several prompt engineering strategies for this purpose. We conduct experiments on real-world web table datasets and, based on the initial results, perform an extensive analysis that yields empirical insights and future directions for researchers in this field.
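As a rough illustration of the kind of labeling function the abstract refers to (a sketch for intuition only, not taken from the paper; the type id EMAIL, the abstain convention, and the regex threshold are assumptions), a column-level labeling function for semantic type detection might look like the following:

```python
# Hypothetical labeling function for semantic column type detection.
# Names (EMAIL, ABSTAIN) and the 0.8 threshold are illustrative assumptions.
import re

ABSTAIN = -1   # returned when this function cannot decide
EMAIL = 0      # example semantic type id

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def lf_email_column(column_values):
    """Vote EMAIL if most non-empty cells look like e-mail addresses,
    otherwise abstain so other labeling functions can decide."""
    cells = [v for v in column_values if v]
    if not cells:
        return ABSTAIN
    matches = sum(bool(EMAIL_RE.match(v)) for v in cells)
    return EMAIL if matches / len(cells) > 0.8 else ABSTAIN

# Example: a column of e-mail strings receives the EMAIL vote.
print(lf_email_column(["a@x.com", "b@y.org", "c@z.net"]))  # -> 0
```

In a programmatic weak supervision pipeline, many such noisy voting functions would be aggregated into probabilistic training labels; the paper's proposal is to have an LLM draft such functions rather than writing them by hand.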