
java - Using 2 beans of the same type in Spring: javax.sql.DataSource


I am developing a Spring Boot based application in which I want to create 2 beans: one pointing to an "Oracle" database and the other pointing to Hive. I have declared them as follows:

public @Bean
BoneCPDataSource metadataDataSource() {
    BoneCPDataSource boneCPDataSource = new BoneCPDataSource();
    boneCPDataSource.setDriverClass(getDriver());
    boneCPDataSource.setJdbcUrl(getJdbcUrl());
    boneCPDataSource.setUser(getUser());
    boneCPDataSource.setPassword(getPassword());
    boneCPDataSource.setMaxConnectionsPerPartition(5);
    boneCPDataSource.setPartitionCount(5);
    return boneCPDataSource;
}

public @Bean
BasicDataSource hiveDataSource() {
    BasicDataSource basicDataSource = new BasicDataSource();

    // Note: In a separate command window, use port forwarding like this:
    //
    // ssh -L 127.0.0.1:9996:<server>:<port> -l <userid> <server>
    //
    // and then login as the generic user.

    basicDataSource.setDriverClassName("org.apache.hadoop.hive.jdbc.HiveDriver");
    basicDataSource.setUrl("jdbc:hive://127.0.0.1:9996:10000/mytable");
    return basicDataSource;
}

The problem is that at startup I get this:

Exception in thread "main"
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration':
Injection of autowired dependencies failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Could not
autowire field: private javax.sql.DataSource
org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration.dataSource;
nested exception is
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No
qualifying bean of type [javax.sql.DataSource] is defined: expected
single matching bean but found 2: metadataDataSource,hiveDataSource

This happens mainly because both of them implement javax.sql.DataSource. What is the best way to fix this?


Edit:

I now declare them as follows:

public @Bean (name="metadataDataSource")
BoneCPDataSource metadataDataSource() {
BoneCPDataSource boneCPDataSource = new BoneCPDataSource();
boneCPDataSource.setDriverClass(getDriver());
boneCPDataSource.setJdbcUrl(getJdbcUrl());
boneCPDataSource.setUser(getUser());
boneCPDataSource.setPassword(getPassword());
boneCPDataSource.setMaxConnectionsPerPartition(5);
boneCPDataSource.setPartitionCount(5);
return boneCPDataSource;
}

public @Bean (name="hiveDataSource")
BasicDataSource hiveDataSource() {
BasicDataSource basicDataSource = new BasicDataSource();

// Note: In a separate command window, use port forwarding like this:
//
// ssh -L 127.0.0.1:9996:<server>:<port> -l <userid> <server>
//
// and then login as the generic user.

basicDataSource.setDriverClassName("org.apache.hadoop.hive.jdbc.HiveDriver");
basicDataSource.setUrl("jdbc:hive://127.0.0.1:9996:10000/mytable");
return basicDataSource;
}

and got this exception:

Exception in thread "main"
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration':
Injection of autowired dependencies failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Could not
autowire field: private javax.sql.DataSource
org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration.dataSource;
nested exception is
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No
qualifying bean of type [javax.sql.DataSource] is defined: expected
single matching bean but found 2: metadataDataSource,hiveDataSource

Other classes reference these beans as follows:

public class MetadataProcessorImpl implements MetadataProcessor {

    @Autowired
    @Qualifier("metadataDataSource")
    BoneCPDataSource metadataDataSource;

@Controller
public class HiveController {

    @Autowired
    @Qualifier("hiveDataSource")
    BasicDataSource hiveDataSource;

Best Answer

Mark one of the two beans as @Primary, as described in the "67.2 Configure Two DataSources" section of the Spring Boot reference documentation:

http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#howto-two-datasources

The Actuator's EndpointAutoConfiguration shown in the stack trace injects a plain javax.sql.DataSource with no qualifier, so with two unqualified DataSource beans Spring cannot choose between them; marking one @Primary tells it which bean to use by default.
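A minimal sketch of that suggestion, reusing the bean names and setters from the question (the Oracle driver class, URL and credentials here are placeholders, and BasicDataSource is assumed to be Commons DBCP's, as in the original code):

import org.apache.commons.dbcp.BasicDataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

import com.jolbox.bonecp.BoneCPDataSource;

@Configuration
public class DataSourceConfiguration {

    // The metadata (Oracle) data source is marked @Primary, so any
    // unqualified DataSource injection point -- such as the Actuator's
    // EndpointAutoConfiguration from the stack trace -- resolves to it.
    @Primary
    @Bean(name = "metadataDataSource")
    public BoneCPDataSource metadataDataSource() {
        BoneCPDataSource boneCPDataSource = new BoneCPDataSource();
        boneCPDataSource.setDriverClass("oracle.jdbc.OracleDriver");            // placeholder
        boneCPDataSource.setJdbcUrl("jdbc:oracle:thin:@//dbhost:1521/service"); // placeholder
        boneCPDataSource.setUser("user");                                        // placeholder
        boneCPDataSource.setPassword("password");                                // placeholder
        boneCPDataSource.setMaxConnectionsPerPartition(5);
        boneCPDataSource.setPartitionCount(5);
        return boneCPDataSource;
    }

    // The Hive data source stays a secondary bean; inject it explicitly
    // with @Qualifier("hiveDataSource") where it is needed.
    @Bean(name = "hiveDataSource")
    public BasicDataSource hiveDataSource() {
        BasicDataSource basicDataSource = new BasicDataSource();
        basicDataSource.setDriverClassName("org.apache.hadoop.hive.jdbc.HiveDriver");
        basicDataSource.setUrl("jdbc:hive://127.0.0.1:9996:10000/mytable");
        return basicDataSource;
    }
}

With one bean primary, the @Qualifier("metadataDataSource") and @Qualifier("hiveDataSource") injections from the edit above keep working unchanged; only injection points without a qualifier fall back to the primary bean.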

Regarding "java - Using 2 beans of the same type in Spring: javax.sql.DataSource", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/23687369/
