ERROR Error executing verb "text_embed" in create_final_entities: 'NoneType' object is not iterable #7

Closed
Ikaros-521 opened this issue Jul 22, 2024 · 1 comment

17:01:42,23 httpcore.connection DEBUG close.complete
17:01:42,23 openai._base_client DEBUG HTTP Request: POST http://127.0.0.1:11434/api/embeddings "200 OK"
17:01:42,25 graphrag.index.reporting.file_workflow_callbacks INFO Error Invoking LLM details={'input': ['"ALICE ARAUJO MARQUES DE S\xc1":"Alice Araujo Marques de S�� is the lead researcher of a study on biomimetics and design.")', '"BIOMIMETICS 2023":"Biomimetics 2023" refers to a journal or event that focuses on interdisciplinary knowledge and creativity, highlighting the promotion of interconnections between various fields.', '"CREATIVE ACTIVITIES":"Creative activities" are described as adopting diverse and integrated knowledge, showcasing how they utilize various spheres for their performance.', '"INTERDISCIPLINARY KNOWLEDGE":"Interdisciplinary knowledge" is the foundation upon which creative activities operate and interconnections between different fields are promoted.', '"STANISLAV N. GORB":"Stanislav N. Gorb" is not only an Academic Editor but also a renowned researcher in the field of biomechanics with significant contributions to the understanding of animal locomotion, making him a key figure in Biomimetics 2023.""Stanislav N. Gorb" acts as an Academic Editor for Biomimetics 2023, indicating his role in overseeing scholarly content within the interdisciplinary field of biomimetics."', '"ACADEMIC EDITOR":', '"BIOMIMETICS SOCIETY":"Biomimetics Society" is a global organization that aims to promote and advance the field of biomimetics through research, education, and collaboration."', '"SUPPORTING BODY":', '"CONTRIBUTOR":']}
17:01:42,25 datashaper.workflow.workflow ERROR Error executing verb "text_embed" in create_final_entities: 'NoneType' object is not iterable
Traceback (most recent call last):
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\datashaper\workflow\workflow.py", line 415, in _execute_verb
    result = await result
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\text_embed.py", line 105, in text_embed
    return await _text_embed_in_memory(
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\text_embed.py", line 130, in _text_embed_in_memory
    result = await strategy_exec(texts, callbacks, cache, strategy_args)
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\strategies\openai.py", line 62, in run
    embeddings = await _execute(llm, text_batches, ticker, semaphore)
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\strategies\openai.py", line 106, in _execute
    results = await asyncio.gather(*futures)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\asyncio\tasks.py", line 304, in __wakeup
    future.result()
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\asyncio\tasks.py", line 232, in __step
    result = coro.send(None)
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\strategies\openai.py", line 100, in embed
    chunk_embeddings = await llm(chunk)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\caching_llm.py", line 104, in __call__
    result = await self._delegate(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 177, in __call__
    result, start = await execute_with_retry()
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 159, in execute_with_retry
    async for attempt in retryer:
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\asyncio\__init__.py", line 166, in __anext__
    do = await self.iter(retry_state=self._retry_state)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\asyncio\__init__.py", line 153, in iter
    result = await action(retry_state)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\_utils.py", line 99, in inner
    return call(*args, **kwargs)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\concurrent\futures\_base.py", line 451, in result
    return self.__get_result()
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 165, in execute_with_retry
    return await do_attempt(), start
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 147, in do_attempt
    return await self._delegate(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\base_llm.py", line 49, in __call__
    return await self._invoke(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\base_llm.py", line 53, in _invoke
    output = await self._execute_llm(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\openai\openai_embeddings_llm.py", line 36, in _execute_llm
    embedding = await self.client.embeddings.create(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\resources\embeddings.py", line 215, in create
    return await self._post(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1826, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1519, in request
    return await self._request(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1622, in _request
    return await self._process_response(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1714, in _process_response
    return await api_response.parse()
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_response.py", line 419, in parse
    parsed = self._options.post_parser(parsed)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\resources\embeddings.py", line 203, in parser
    for embedding in obj.data:
TypeError: 'NoneType' object is not iterable
17:01:42,28 graphrag.index.reporting.file_workflow_callbacks INFO Error executing verb "text_embed" in create_final_entities: 'NoneType' object is not iterable details=None
17:01:42,28 graphrag.index.run ERROR error running workflow create_final_entities
Traceback (most recent call last):
  File "F:\GraphRAG-Ollama-UI\graphrag\index\run.py", line 323, in run_pipeline
    result = await workflow.run(context, callbacks)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\datashaper\workflow\workflow.py", line 369, in run
    timing = await self._execute_verb(node, context, callbacks)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\datashaper\workflow\workflow.py", line 415, in _execute_verb
    result = await result
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\text_embed.py", line 105, in text_embed
    return await _text_embed_in_memory(
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\text_embed.py", line 130, in _text_embed_in_memory
    result = await strategy_exec(texts, callbacks, cache, strategy_args)
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\strategies\openai.py", line 62, in run
    embeddings = await _execute(llm, text_batches, ticker, semaphore)
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\strategies\openai.py", line 106, in _execute
    results = await asyncio.gather(*futures)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\asyncio\tasks.py", line 304, in __wakeup
    future.result()
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\asyncio\tasks.py", line 232, in __step
    result = coro.send(None)
  File "F:\GraphRAG-Ollama-UI\graphrag\index\verbs\text\embed\strategies\openai.py", line 100, in embed
    chunk_embeddings = await llm(chunk)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\caching_llm.py", line 104, in __call__
    result = await self._delegate(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 177, in __call__
    result, start = await execute_with_retry()
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 159, in execute_with_retry
    async for attempt in retryer:
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\asyncio\__init__.py", line 166, in __anext__
    do = await self.iter(retry_state=self._retry_state)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\asyncio\__init__.py", line 153, in iter
    result = await action(retry_state)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\_utils.py", line 99, in inner
    return call(*args, **kwargs)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\tenacity\__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\concurrent\futures\_base.py", line 451, in result
    return self.__get_result()
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 165, in execute_with_retry
    return await do_attempt(), start
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\rate_limiting_llm.py", line 147, in do_attempt
    return await self._delegate(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\base_llm.py", line 49, in __call__
    return await self._invoke(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\base\base_llm.py", line 53, in _invoke
    output = await self._execute_llm(input, **kwargs)
  File "F:\GraphRAG-Ollama-UI\graphrag\llm\openai\openai_embeddings_llm.py", line 36, in _execute_llm
    embedding = await self.client.embeddings.create(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\resources\embeddings.py", line 215, in create
    return await self._post(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1826, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1519, in request
    return await self._request(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1622, in _request
    return await self._process_response(
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_base_client.py", line 1714, in _process_response
    return await api_response.parse()
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\_response.py", line 419, in parse
    parsed = self._options.post_parser(parsed)
  File "f:\GraphRAG-Ollama-UI\Miniconda3\lib\site-packages\openai\resources\embeddings.py", line 203, in parser
    for embedding in obj.data:
TypeError: 'NoneType' object is not iterable
17:01:42,29 graphrag.index.reporting.file_workflow_callbacks INFO Error running pipeline! details=None
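For context on the traceback: the final frames are inside the OpenAI client's embeddings post-parser, which iterates over `obj.data`. That field is `None` here because the request went to Ollama's native `/api/embeddings` endpoint, whose response shape (`{"embedding": [...]}`) does not match the OpenAI schema (`{"data": [{"embedding": [...]}], ...}`). A minimal sketch of the mismatch (not GraphRAG code; the endpoint and the `nomic-embed-text` model name are assumptions for illustration):

```python
import requests

# Ollama's native endpoint returns {"embedding": [...]} -- there is no "data"
# field, so the OpenAI client's post-parser ends up iterating over None.
native = requests.post(
    "http://127.0.0.1:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "hello world"},
).json()
print(sorted(native.keys()))  # ['embedding']

# Newer Ollama releases also serve an OpenAI-compatible route under /v1, which
# returns the {"object", "data", "model", "usage"} shape the openai client expects.
compat = requests.post(
    "http://127.0.0.1:11434/v1/embeddings",
    json={"model": "nomic-embed-text", "input": "hello world"},
).json()
print(sorted(compat.keys()))
```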

Ikaros-521 commented Jul 22, 2024

See microsoft/graphrag#339.
The root cause is mainly the LLM embedding model configuration (local Ollama still has quite a few compatibility issues).
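One workaround discussed in microsoft/graphrag#339 is to bypass the OpenAI client for embeddings and call Ollama directly from `graphrag/llm/openai/openai_embeddings_llm.py`. A minimal sketch of a replacement `_execute_llm`, assuming the `ollama` Python package is installed and an embedding model such as `nomic-embed-text` has been pulled (both are assumptions; the rest of the class stays unchanged):

```python
import ollama  # added at the top of openai_embeddings_llm.py

async def _execute_llm(self, input, **kwargs):
    # Call Ollama's embedding API directly instead of self.client.embeddings.create,
    # sidestepping the OpenAI response parser that fails when "data" is missing.
    embedding_list = []
    for text in input:
        # "nomic-embed-text" is illustrative; use the embedding model configured
        # for this pipeline.
        result = ollama.embeddings(model="nomic-embed-text", prompt=text)
        embedding_list.append(result["embedding"])
    return embedding_list
```

Alternatively, pointing the embeddings `api_base` at Ollama's OpenAI-compatible `/v1` endpoint (rather than `/api`) lets the stock OpenAI client parse the response, provided the installed Ollama version supports `/v1/embeddings`.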
