Fix image to text node, it was bugged #44
Conversation
Dependency Review: ✅ No vulnerabilities, license issues, or OpenSSF Scorecard issues found.
    Methods:
        execute(state, url): Execute the node's logic and return the updated state.
    """
-    def __init__(self, llm, node_name: str):
+    def __init__(self, input: str, output: List[str], model_config: dict,
+                 node_name: str = "GetProbableTags"):
change default node_name
did it
""" | ||
super().__init__(node_name, "node") | ||
self.llm = llm | ||
super().__init__(node_name, "node", input, output, 2, model_config) |
It should accept only one input, which is the url (or the list of urls) present in the state.
did it
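For readers following along, here is a minimal sketch of what the constructor could look like after these two comments (a single input key and a renamed default node_name). The BaseNode stub, the "ImageToText" default, and the argument order are assumptions reconstructed from the diff, not the merged code.

```python
from typing import List


class BaseNode:
    """Minimal stand-in for the repository's BaseNode (assumption for this sketch)."""
    def __init__(self, node_name, node_type, input, output, min_input_len, model_config):
        self.node_name = node_name
        self.node_type = node_type
        self.input = input
        self.output = output
        self.min_input_len = min_input_len
        self.model_config = model_config


class ImageToTextNode(BaseNode):
    def __init__(self, input: str, output: List[str], model_config: dict,
                 node_name: str = "ImageToText"):   # renamed default (assumption)
        # a single input key: the url (or list of urls) held in the state
        super().__init__(node_name, "node", input, output, 1, model_config)
        self.llm_model = model_config["llm_model"]
```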
print("---GENERATING TEXT FROM IMAGE---") | ||
text_answer = self.llm.run(url) | ||
text_answer = self.llm_model.run(url) |
The retrieval step that reads the url from the state is missing before this line.
-        super().__init__(node_name, "node")
-        self.llm = llm
+        super().__init__(node_name, "node", input, output, 2, model_config)
+        self.llm_model = model_config["llm_model"]

     def execute(self, state: dict, url: str) -> dict:
here we can remove the url arg since we use our graph syntax for state retrieval
you are right
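Putting the two comments together, execute() would drop the url argument and read the url from the state via the node's input key. Continuing the class sketch above, get_input_keys() and the output-key update are assumptions about the project's graph/state syntax, not the merged implementation.

```python
    def execute(self, state: dict) -> dict:
        print("---GENERATING TEXT FROM IMAGE---")

        # retrieve the url (or list of urls) from the state using the input key;
        # get_input_keys() is an assumed helper for the graph syntax
        input_keys = self.get_input_keys(state)
        url = state[input_keys[0]]

        if isinstance(url, list):
            text_answer = [self.llm_model.run(u) for u in url]
        else:
            text_answer = self.llm_model.run(url)

        # write the generated text back into the state under the output key
        state.update({self.output[0]: text_answer})
        return state
```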
@@ -71,8 +71,7 @@ def _create_llm(self, llm_config: dict):
             return OpenAI(llm_params)
         elif "gemini" in llm_params["model"]:
             return Gemini(llm_params)
-        else:
-            raise ValueError("Model not supported")
+        raise ValueError("Model not supported")
nope
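For context on the hunk above: because every matched branch returns, placing the raise in an explicit else or letting it sit after the if/elif chain behaves identically at runtime. A self-contained sketch with placeholder wrapper classes; the OpenAI condition is not visible in the hunk, so the "gpt" check is an assumption.

```python
class OpenAI:            # placeholder, not the real wrapper class
    def __init__(self, params): self.params = params

class Gemini:            # placeholder, not the real wrapper class
    def __init__(self, params): self.params = params

def create_llm(llm_params: dict):
    # each branch returns, so the trailing raise is reached only when no model matches
    if "gpt" in llm_params["model"]:          # assumed condition, cut off in the hunk
        return OpenAI(llm_params)
    elif "gemini" in llm_params["model"]:
        return Gemini(llm_params)
    raise ValueError("Model not supported")

create_llm({"model": "gemini-pro"})           # returns a Gemini instance
```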