- Let’s assume that we have two entities, `Author` and `Book`. There is no materialized association between them, but both entities share an attribute named `genre`. We want to use this attribute to join the tables corresponding to `Author` and `Book`, and fetch the result in a DTO. The result should contain the `Author` entity and only the `title` attribute from `Book`. In such a scenario, it is strongly advisable to avoid fetching the DTO via a constructor expression. That approach cannot fetch the data in a single `SELECT` and is prone to N+1 queries. Far better alternatives are Spring projections, JPA `Tuple`, or even Hibernate's `ResultTransformer`. These approaches fetch the data in a single `SELECT`. This application is a DON'T DO THIS example. Check the number of queries needed for fetching the data.
- Spring `deleteAllInBatch()` and `deleteInBatch()` don't use delete batching and don't take advantage of cascading removal, `orphanRemoval`, or the automatic optimistic locking mechanism that prevents lost updates (e.g., `@Version` is ignored), but both of them take advantage of `ON DELETE CASCADE` and are very efficient. They trigger bulk operations via `Query.executeUpdate()`, therefore the Persistence Context is not synchronized accordingly (it's up to you to flush (before delete) and close/clear (after delete) the Persistence Context in order to avoid issues created by unflushed (if any) or outdated (if any) entities). The first one simply triggers a `delete from entity_name` statement, while the second one triggers a `delete from entity_name where id=? or id=? or id=? ...` statement. For deleting in batches, rely on the `deleteAll()`, `deleteAll(Iterable<? extends T> entities)`, or `delete()` methods. Behind the scenes, the two flavors of `deleteAll()` rely on `delete()`. Mixing batching with database automatic actions (`ON DELETE CASCADE`) will result in a partially synchronized Persistence Context.
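  As a rough sketch of the flush-before/clear-after advice above (the service class, the `findByGenre()` derived query, and the method names are assumptions, not part of this application; in newer Spring Data versions `deleteInBatch(Iterable)` has been superseded by `deleteAllInBatch(Iterable)`):

  ```java
  import java.util.List;

  import javax.persistence.EntityManager;
  import javax.persistence.PersistenceContext;

  import org.springframework.stereotype.Service;
  import org.springframework.transaction.annotation.Transactional;

  @Service
  public class BulkDeleteService {

      @PersistenceContext
      private EntityManager entityManager;

      private final AuthorRepository authorRepository;

      public BulkDeleteService(AuthorRepository authorRepository) {
          this.authorRepository = authorRepository;
      }

      @Transactional
      public void deleteAuthorsByGenre(String genre) {
          List<Author> authors = authorRepository.findByGenre(genre);

          // push any pending changes to the database before the bulk delete
          entityManager.flush();

          // single "delete from author where id=? or id=? ..." statement;
          // bypasses cascading, orphanRemoval and @Version checks
          authorRepository.deleteInBatch(authors);

          // discard the now-outdated managed entities so they are not
          // accidentally reused later in this transaction
          entityManager.clear();
      }
  }
  ```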