Describe the issue: I have completed the test model inference and received a dialogue response from the model. However, after setting up the AI search flow and adding sample data, this message appears when I click on the ML Inference Processor. Did I miss any configuration?
Configuration (see the sketch after this list):
- Create a connector for DeepSeek Chat
- Create a model group
- Register the model to the model group and deploy it
- Test model inference

I can receive responses from DeepSeek and converse with it; in the Machine Learning view, DeepSeek Chat shows as Responding.
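For reference, here is a minimal sketch of those setup steps driven through the ML Commons REST API rather than the Dashboards UI. It is not the poster's exact configuration: the host, credentials, DeepSeek API key, and the trimmed connector body are placeholder assumptions, and the exact connector/request shapes depend on your connector blueprint and OpenSearch version.

```python
# Sketch: configure a remote DeepSeek Chat model via the ML Commons REST API.
# Host, credentials, and the API key below are assumptions; adjust to your cluster.
import requests

HOST = "https://localhost:9200"   # assumption: local single-node cluster
AUTH = ("admin", "admin")         # assumption: default basic-auth credentials
VERIFY_TLS = False                # assumption: self-signed demo certificate

def post(path, body):
    """Send a JSON request to ML Commons and return the parsed response."""
    r = requests.post(f"{HOST}{path}", json=body, auth=AUTH, verify=VERIFY_TLS)
    r.raise_for_status()
    return r.json()

# 1. Create a connector for DeepSeek Chat (body abbreviated; follow a connector
#    blueprint for the full set of fields your version expects).
connector = post("/_plugins/_ml/connectors/_create", {
    "name": "DeepSeek Chat",
    "protocol": "http",
    "parameters": {"model": "deepseek-chat"},
    "credential": {"deepSeek_key": "<YOUR_DEEPSEEK_API_KEY>"},
    "actions": [{
        "action_type": "predict",
        "method": "POST",
        "url": "https://api.deepseek.com/v1/chat/completions",
        "headers": {
            "Content-Type": "application/json",
            "Authorization": "Bearer ${credential.deepSeek_key}",
        },
        "request_body": "{ \"model\": \"${parameters.model}\", \"messages\": ${parameters.messages} }",
    }],
})

# 2. Create a model group.
group = post("/_plugins/_ml/model_groups/_register",
             {"name": "deepseek_models", "description": "Remote DeepSeek models"})

# 3. Register the remote model to the group, then deploy it.
model = post("/_plugins/_ml/models/_register", {
    "name": "deepseek-chat",
    "function_name": "remote",
    "model_group_id": group["model_group_id"],
    "connector_id": connector["connector_id"],
})
# Depending on the version, registration may be asynchronous and return only a
# task_id; in that case poll GET /_plugins/_ml/tasks/<task_id> for the model_id.
model_id = model.get("model_id")
post(f"/_plugins/_ml/models/{model_id}/_deploy", {})

# 4. Test model inference with a simple chat message. The parameter names here
#    must match the placeholders used in the connector's request_body template.
prediction = post(f"/_plugins/_ml/models/{model_id}/_predict", {
    "parameters": {"messages": [{"role": "user", "content": "Hello!"}]},
})
print(prediction)
```

If each of these calls succeeds and the test predict returns a chat completion, the model itself is configured; the ML Inference Processor message would then point at the search-flow side of the setup rather than the connector or deployment.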
Thank you for your answer; I have completed the registration. But I have a new question: what is an embedding model? I haven't found any relevant documentation, and I am not a developer, just a self-learner with no prior knowledge. Could you give me some pointers? Thank you, and I wish you a great day!