Error when using @Autowired

spring hibernate apache-spark dependency-injection autowired



I am trying to build a base project on top of Spark. I did not know how to do Dependency Injection there, so I added the Spring Core and Spring Context dependencies to use Spring's DI.

I use Hibernate to connect to the database, but I get an error when I try to `@Autowired` the repository layer.

Here is my code:

The pom.xml file:

<dependencies>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.1</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>2.5.1</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.5.1</version>
    </dependency>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.21</version>
    </dependency>
    <dependency>
        <groupId>org.sql2o</groupId>
        <artifactId>sql2o</artifactId>
        <version>1.5.4</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.38</version>
    </dependency>
    <dependency>
        <groupId>com.beust</groupId>
        <artifactId>jcommander</artifactId>
        <version>1.48</version>
    </dependency>
    <dependency>
        <groupId>net.sf.derquinsej</groupId>
        <artifactId>derquinsej-core</artifactId>
        <version>0.0.2</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.3.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.springframework/spring-core -->
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-core</artifactId>
        <version>4.2.5.RELEASE</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.springframework/spring-context -->
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context</artifactId>
        <version>4.2.5.RELEASE</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.springframework.data/spring-data-jpa -->
    <dependency>
        <groupId>org.springframework.data</groupId>
        <artifactId>spring-data-jpa</artifactId>
        <version>1.10.1.RELEASE</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.hibernate/hibernate-entitymanager -->
    <dependency>
        <groupId>org.hibernate</groupId>
        <artifactId>hibernate-entitymanager</artifactId>
        <version>5.1.0.Final</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.springframework/spring-beans -->
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-beans</artifactId>
        <version>4.2.5.RELEASE</version>
    </dependency>

</dependencies>

The Hibernate configuration file:

@Configuration
@ComponentScan({"com.higgsup.internship.spark"})
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = "com.higgsup.internship.repository")
public class DatabaseConfig {

    @Autowired
    private Environment environment;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUrl("jdbc:mysql://localhost:3306/test");
        dataSource.setUsername("root");
        dataSource.setPassword("");
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean entityManagerFactory = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactory.setDataSource(dataSource());
        entityManagerFactory.setPackagesToScan("com.higgsup.internship.spark.model");

        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        entityManagerFactory.setJpaVendorAdapter(vendorAdapter);

        Properties additionalProperties = new Properties();
        additionalProperties.put("hibernate.dialect", "org.hibernate.dialect.MySQLDialect");
        additionalProperties.put("hibernate.ejb.naming_strategy", "org.hibernate.cfg.ImprovedNamingStrategy");
        additionalProperties.put("hibernate.show_sql", "true");
        additionalProperties.put("hibernate.hbm2ddl.auto", "create-drop");
        entityManagerFactory.setJpaProperties(additionalProperties);

        return entityManagerFactory;
    }

    @Bean
    public JpaTransactionManager transactionManager() {
        return new JpaTransactionManager(entityManagerFactory().getObject());
    }

    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }
}

The UserRepository file:

@Repository
public interface UserRepository extends CrudRepository<User, Integer> {
}

The UserService file (the error happens here, at the @Autowired field):

@Service
public class UserService {

    @Autowired
    UserRepository userRepository;

    @Bean
    public User user() {
        return new User("hung");
    }
}

The Application file:

@ComponentScan
public class Application {

    private static final int PORT = 1098;

    public static String dataToJson(Object data) {
        try {
            ObjectMapper mapper = new ObjectMapper();
            mapper.enable(SerializationFeature.INDENT_OUTPUT);
            StringWriter sw = new StringWriter();
            mapper.writeValue(sw, data);
            return sw.toString();
        } catch (IOException e) {
            throw new RuntimeException("IOException from a StringWriter?");
        }
    }

    public static void main(String[] args) {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Application.class);
        new Controller();

        try {
            LocateRegistry.createRegistry(1099);
            Calculator c = new CalculatorImpl();
            Naming.bind("rmi://localhost:1099/CalculatorService", c);
        } catch (Exception e) {
            System.out.println("Trouble: " + e);
        }
    }
}

This is the log when I run it:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/ASUS/.m2/repository/org/slf4j/slf4j-simple/1.7.21/slf4j-simple-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/ASUS/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
[main] INFO org.springframework.context.annotation.AnnotationConfigApplicationContext - Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@128c152: startup date [Sat Aug 20 21:52:51 ICT 2016]; root of context hierarchy
[main] INFO org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring
[main] INFO org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'databaseConfig' of type [class com.higgsup.internship.spark.config.DatabaseConfig$$EnhancerBySpringCGLIB$$f1472d2c] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
[main] INFO org.springframework.jdbc.datasource.DriverManagerDataSource - Loaded JDBC driver: com.mysql.jdbc.Driver
log4j:WARN No appenders could be found for logger (org.jboss.logging).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[main] INFO org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean - Building JPA container EntityManagerFactory for persistence unit 'default'
Hibernate: drop table if exists hibernate_sequence
Hibernate: drop table if exists User
Hibernate: create table hibernate_sequence (next_val bigint)
Hibernate: insert into hibernate_sequence values ( 1 )
Hibernate: create table User (id integer not null, userName varchar(255), primary key (id))
[main] WARN org.springframework.context.annotation.AnnotationConfigApplicationContext - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userService': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: com.higgsup.internship.spark.repository.UserRepository com.higgsup.internship.spark.service.UserService.userRepository; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [com.higgsup.internship.spark.repository.UserRepository] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
[main] INFO org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean - Closing JPA EntityManagerFactory for persistence unit 'default'
Hibernate: drop table if exists hibernate_sequence
Hibernate: drop table if exists User
Exception in thread "main" org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userService': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: com.higgsup.internship.spark.repository.UserRepository com.higgsup.internship.spark.service.UserService.userRepository; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [com.higgsup.internship.spark.repository.UserRepository] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:334)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1214)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482)
	at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:772)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:839)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:538)
	at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:84)
	at com.higgsup.internship.spark.Application.main(Application.java:39)
Caused by: org.springframework.beans.factory.BeanCreationException: Could not autowire field: com.higgsup.internship.spark.repository.UserRepository com.higgsup.internship.spark.service.UserService.userRepository; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [com.higgsup.internship.spark.repository.UserRepository] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:573)
	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:88)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:331)
	... 12 more
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [com.higgsup.internship.spark.repository.UserRepository] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoSuchBeanDefinitionException(DefaultListableBeanFactory.java:1373)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1119)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1014)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:545)
	... 14 more
Disconnected from the target VM, address: '127.0.0.1:62605', transport: 'socket'

Process finished with exit code 1
Author: Hưng Híp. Posted: 08.11.2019 11:28

Answers (1)


1 upvote

This is the fully qualified class name from your logs: com.higgsup.internship.spark.repository.UserRepository

and this is the package name you pass to @EnableJpaRepositories:
com.higgsup.internship.repository

Change your @EnableJpaRepositories configuration to: @EnableJpaRepositories(basePackages = "com.higgsup.internship.spark.repository")
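As a sketch, the corrected class header would look like this (annotation values taken from the question's code and logs; only basePackages changes, the rest of DatabaseConfig stays as it is):

```java
@Configuration
@ComponentScan({"com.higgsup.internship.spark"})
@EnableTransactionManagement
// basePackages must name the package that actually contains UserRepository;
// the log shows it lives in com.higgsup.internship.spark.repository
@EnableJpaRepositories(basePackages = "com.higgsup.internship.spark.repository")
public class DatabaseConfig {
    // ... beans unchanged
}
```

Spring Data JPA only generates repository implementations for interfaces found under the packages listed in basePackages, so with the old value no UserRepository bean ever existed, which is exactly what the NoSuchBeanDefinitionException reports.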

Author: Maciej Marczuk. Posted: 20.08.2016 03:22