As I understand it, the microservice layer is meant to define consistent, well-defined APIs that hide the differences between implementations.
When composing pipelines, a user should be able to switch between backends with minimal effort.
So, for the dataprep category (comps/dataprep), all implementations should expose the same port.
The milvus and redis implementations currently use different ports: 6010 and 6007.
A grep shows that 6008 and 6009 are also used, possibly for different purposes:
GenAIComps/comps/dataprep$ grep -R "port=" *
milvus/langchain/README.md:your_port=6010
milvus/langchain/prepare_doc_milvus.py:@register_microservice(name="opea_service@prepare_doc_milvus", endpoint="/v1/dataprep", host="0.0.0.0", port=6010)
milvus/langchain/prepare_doc_milvus.py: name="opea_service@prepare_doc_milvus", endpoint="/v1/dataprep/get_file", host="0.0.0.0", port=6010
milvus/langchain/prepare_doc_milvus.py: name="opea_service@prepare_doc_milvus", endpoint="/v1/dataprep/delete_file", host="0.0.0.0", port=6010
multimodal/redis/langchain/prepare_videodoc_redis.py: name="opea_service@prepare_videodoc_redis", endpoint="/v1/generate_transcripts", host="0.0.0.0", port=6007
multimodal/redis/langchain/prepare_videodoc_redis.py: name="opea_service@prepare_videodoc_redis", endpoint="/v1/generate_captions", host="0.0.0.0", port=6007
multimodal/redis/langchain/prepare_videodoc_redis.py: port=6007,
multimodal/redis/langchain/prepare_videodoc_redis.py: name="opea_service@prepare_videodoc_redis", endpoint="/v1/dataprep/get_videos", host="0.0.0.0", port=6007
multimodal/redis/langchain/prepare_videodoc_redis.py: name="opea_service@prepare_videodoc_redis", endpoint="/v1/dataprep/delete_videos", host="0.0.0.0", port=6007
neo4j/langchain/prepare_doc_neo4j.py: port=6007,
pgvector/langchain/prepare_doc_pgvector.py: connection = psycopg2.connect(database=database, user=username, password=password, host=hostname, port=port)
pgvector/langchain/prepare_doc_pgvector.py: port=6007,
pgvector/langchain/prepare_doc_pgvector.py: name="opea_service@prepare_doc_pgvector", endpoint="/v1/dataprep/get_file", host="0.0.0.0", port=6007
pgvector/langchain/prepare_doc_pgvector.py: name="opea_service@prepare_doc_pgvector", endpoint="/v1/dataprep/delete_file", host="0.0.0.0", port=6007
pinecone/langchain/prepare_doc_pinecone.py:@register_microservice(name="opea_service@prepare_doc_pinecone", endpoint="/v1/dataprep", host="0.0.0.0", port=6007)
pinecone/langchain/prepare_doc_pinecone.py: name="opea_service@prepare_doc_pinecone_file", endpoint="/v1/dataprep/get_file", host="0.0.0.0", port=6008
pinecone/langchain/prepare_doc_pinecone.py: name="opea_service@prepare_doc_pinecone_del", endpoint="/v1/dataprep/delete_file", host="0.0.0.0", port=6009
qdrant/langchain/prepare_doc_qdrant.py: port=QDRANT_PORT,
qdrant/langchain/prepare_doc_qdrant.py: port=6007,
redis/README.md:your_port=6006
redis/langchain_ray/prepare_doc_redis_on_ray.py:@register_microservice(name="opea_service@prepare_doc_redis", endpoint="/v1/dataprep", host="0.0.0.0", port=6007)
redis/langchain_ray/prepare_doc_redis_on_ray.py: name="opea_service@prepare_doc_redis_file", endpoint="/v1/dataprep/get_file", host="0.0.0.0", port=6008
redis/langchain_ray/prepare_doc_redis_on_ray.py: name="opea_service@prepare_doc_redis_del", endpoint="/v1/dataprep/delete_file", host="0.0.0.0", port=6009
redis/llama_index/prepare_doc_redis.py:@register_microservice(name="opea_service@prepare_doc_redis", endpoint="/v1/dataprep", host="0.0.0.0", port=6007)
redis/llama_index/prepare_doc_redis.py: name="opea_service@prepare_doc_redis_file", endpoint="/v1/dataprep/get_file", host="0.0.0.0", port=6008
redis/llama_index/prepare_doc_redis.py: name="opea_service@prepare_doc_redis_del", endpoint="/v1/dataprep/delete_file", host="0.0.0.0", port=6009
redis/langchain/prepare_doc_redis.py:@register_microservice(name="opea_service@prepare_doc_redis", endpoint="/v1/dataprep", host="0.0.0.0", port=6007)
redis/langchain/prepare_doc_redis.py: name="opea_service@prepare_doc_redis", endpoint="/v1/dataprep/get_file", host="0.0.0.0", port=6007
redis/langchain/prepare_doc_redis.py: name="opea_service@prepare_doc_redis", endpoint="/v1/dataprep/delete_file", host="0.0.0.0", port=6007
vdms/multimodal_langchain/ingest_videos.py:@register_microservice(name="opea_service@prepare_videodoc_vdms", endpoint="/v1/dataprep", host="0.0.0.0", port=6007)
vdms/multimodal_langchain/ingest_videos.py: port=6007,
vdms/multimodal_langchain/ingest_videos.py: port=6007,
vdms/multimodal_langchain/utils/store_embeddings.py: self.client = VDMS_Client(host=self.host, port=self.port)
vdms/langchain/prepare_doc_vdms.py: port=6007,
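One possible direction, sketched below under stated assumptions: have each prepare_doc_* service read its port from a single shared environment variable instead of hard-coding it, so every backend registers on the same default port. `DATAPREP_PORT` is an assumed variable name for illustration, not something that exists in the repo today; the decorator arguments mirror the ones visible in the grep output above.

```python
# Minimal sketch, not the current implementation: one shared env var
# (DATAPREP_PORT, an assumed name) controls the port for all dataprep backends.
import os

# register_microservice is the decorator the existing dataprep services use;
# the exact import path is assumed from the repo layout.
from comps import register_microservice

DATAPREP_PORT = int(os.getenv("DATAPREP_PORT", "6007"))


@register_microservice(
    name="opea_service@prepare_doc_redis",
    endpoint="/v1/dataprep",
    host="0.0.0.0",
    port=DATAPREP_PORT,
)
async def ingest_documents(files):
    # Backend-specific ingestion logic stays unchanged; only where the
    # port value comes from differs.
    ...
```

With something like this, switching a pipeline from one backend to another would only change the backend image, not the exposed port or the compose/K8s wiring around it.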