Artificial intelligence (AI) is profoundly reshaping cultural communication, offering opportunities for greater efficiency and global reach while raising concerns about the erosion of cultural authenticity and algorithmic bias. This study constructs a three-part analytical framework—technological support, cultural fit, and institutional regulation—that combines Hofstede's cultural dimensions theory with new institutionalism to examine how generative AI interprets cultural symbols and performs in cross-cultural narration. An analysis of how the ChatGPT model series interprets traditional Chinese symbols reveals the limitations of current AI systems, including cultural oversimplification (e.g., reducing the dragon to a symbol of imperial power) and semantic contradictions (e.g., the divergent Western and Chinese conceptions of the phoenix). It also shows that GPT-4's cultural alignment in Chinese contexts (68%) substantially exceeds that of GPT-3.5 (42%). The study proposes a cultural digital-governance approach: building pluralistic cultural knowledge bases, establishing cultural sensitivity evaluation metrics, and implementing tiered human-machine collaboration. These findings offer a reference for balancing technological progress with cultural heritage preservation.