I am trying to write a service that should also be able to search across the referenced models, i.e.:
async getAllSubmissions(
  paginationDto: PaginationDto,
  sortDto: SortDto,
  searchDto: SearchDto,
): Promise<{
  submissions: SubmissionDocument[];
  pagination: PaginationPageInfo;
}> {
  const limit = parseInt(paginationDto.limit?.toString() || '10', 10);
  const skip = parseInt(paginationDto.skip?.toString() || '0', 10);
  const sortField = sortDto.field || '';
  const sortDesc = sortDto.desc || false;

  try {
    let query = this.submissionModel.find();
    query = query
      .sort({ updatedAt: -1 })
      .populate('candidate')
      .populate('role')
      .populate('recruiter');

    if (sortField) {
      if (sortDesc) {
        query = query.sort({ [sortField]: -1 });
      } else {
        query = query.sort({ [sortField]: 1 });
      }
    }

    if (searchDto.search) {
      const searchTerm = new RegExp(searchDto.search, 'i');
      query = query.or([
        { name: { $regex: searchTerm } },
        { 'candidate.firstName': { $regex: searchTerm } },
        { 'candidate.lastName': { $regex: searchTerm } },
        { 'role.role': { $regex: searchTerm } },
      ]);
    }

    // use clone() to avoid modifying the original query for pagination
    const countQuery = query.clone().countDocuments();
    const [submissions, total] = await Promise.all([
      query.skip(skip).limit(Number(limit)).exec(),
      countQuery,
    ]);

    const hasNextPage = total > skip + submissions.length;
    const hasPreviousPage = skip > 0;

    return {
      submissions: submissions,
      pagination: { totalCount: total, hasNextPage, hasPreviousPage },
    };
  } catch (error) {
    console.error(error);
    throw new InternalServerErrorException('Failed to fetch submissions.');
  }
}
That is how it currently stands. I wrote this hoping it would work, but obviously it does not: as soon as a search term is passed, the $or conditions on the populated paths never match anything, because those fields only exist on the referenced documents, not in the submissions collection itself.
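For context, the submission schema references the other models roughly like this. This is a simplified sketch, not the exact code, trimmed to the fields used in the queries above; the candidate, role, and recruiter schemas follow the same pattern:

// Simplified sketch of the schema setup (exact decorators approximate)
import { Prop, Schema, SchemaFactory } from '@nestjs/mongoose';
import { HydratedDocument, Types } from 'mongoose';

@Schema({ timestamps: true })
export class Submission {
  @Prop()
  name: string;

  // References to the other models, resolved via populate() in the service
  @Prop({ type: Types.ObjectId, ref: 'Candidate' })
  candidate: Types.ObjectId;

  @Prop({ type: Types.ObjectId, ref: 'Role' })
  role: Types.ObjectId;

  @Prop({ type: Types.ObjectId, ref: 'Recruiter' })
  recruiter: Types.ObjectId;
}

export type SubmissionDocument = HydratedDocument<Submission>;
export const SubmissionSchema = SchemaFactory.createForClass(Submission);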
After digging through the MongoDB docs, I came across a way to query referenced models using the $lookup stage, which is what I am doing here:
async getAllSubmissions(
  paginationDto: PaginationDto,
  sortDto: SortDto,
  searchDto: SearchDto,
): Promise<{
  submissions: SubmissionDocument[];
  pagination: PaginationPageInfo;
}> {
  const limit = parseInt(paginationDto.limit?.toString() || '10', 10);
  const skip = parseInt(paginationDto.skip?.toString() || '0', 10);
  const sortField = sortDto.field || '';
  const sortDesc = sortDto.desc || false;

  try {
    let query = this.submissionModel.find();
    query = query
      .sort({ updatedAt: -1 })
      .populate('candidate')
      .populate('role')
      .populate('recruiter');

    if (sortField) {
      query = query.sort({ [sortField]: sortDesc ? -1 : 1 });
    }

    if (searchDto.search) {
      const searchTerm = new RegExp(searchDto.search, 'i');
      const searchPipeline = [
        {
          $lookup: {
            from: 'candidates',
            localField: 'candidate',
            foreignField: '_id',
            as: 'candidateData',
          },
        },
        { $unwind: { path: '$candidateData', preserveNullAndEmptyArrays: true } },
        {
          $lookup: {
            from: 'roles',
            localField: 'role',
            foreignField: '_id',
            as: 'roleData',
          },
        },
        { $unwind: { path: '$roleData', preserveNullAndEmptyArrays: true } },
        {
          $match: {
            $or: [
              { name: { $regex: searchTerm } },
              { 'candidateData.firstName': { $regex: searchTerm } },
              { 'candidateData.lastName': { $regex: searchTerm } },
              { 'roleData.role': { $regex: searchTerm } },
            ],
          },
        },
      ];

      const aggregationResults = await this.submissionModel.aggregate(searchPipeline);
      const matchedIds = aggregationResults.map((doc) => doc._id);
      query = query.where('_id').in(matchedIds);
    }

    const countQuery = query.clone().countDocuments();
    const [submissions, total] = await Promise.all([
      query.skip(skip).limit(Number(limit)).exec(),
      countQuery,
    ]);

    return {
      submissions,
      pagination: {
        totalCount: total,
        hasNextPage: total > skip + submissions.length,
        hasPreviousPage: skip > 0,
      },
    };
  } catch (error) {
    console.error(error);
    throw new InternalServerErrorException('Failed to fetch submissions.');
  }
}
This also does not work. I understand that I am trying to force SQL-like behaviour out of a NoSQL database, but during the planning phase we went ahead with this design expecting it would be easy to implement and would not end up complex.
I am not sure how referenced models work in general, since I have never gone deeper into them, so I would really appreciate it if you could point out what I might be doing wrong so we can hopefully solve this issue.
It might be a bit infuriating that I am not using the aggregation pipeline throughout, but I felt a builder-pattern-esque approach was easier, and the data we are handling is not much since this is an internal project, so I hoped we could get this done easily.
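In case it helps, this is roughly what I imagine a pure-aggregation version would look like. It is an untested sketch on my part: the collection names ('candidates', 'roles') are assumed from the default pluralisation, and I would still prefer to keep the builder-style query if it can be made to work:

// Untested sketch: one aggregation that joins, filters, sorts, and paginates.
// $facet returns the current page and the total count in a single round trip.
// PipelineStage is imported from 'mongoose'.
const pipeline: PipelineStage[] = [
  { $lookup: { from: 'candidates', localField: 'candidate', foreignField: '_id', as: 'candidate' } },
  { $unwind: { path: '$candidate', preserveNullAndEmptyArrays: true } },
  { $lookup: { from: 'roles', localField: 'role', foreignField: '_id', as: 'role' } },
  { $unwind: { path: '$role', preserveNullAndEmptyArrays: true } },
];

if (searchDto.search) {
  // Only filter when a search term was actually provided
  pipeline.push({
    $match: {
      $or: [
        { name: { $regex: searchDto.search, $options: 'i' } },
        { 'candidate.firstName': { $regex: searchDto.search, $options: 'i' } },
        { 'candidate.lastName': { $regex: searchDto.search, $options: 'i' } },
        { 'role.role': { $regex: searchDto.search, $options: 'i' } },
      ],
    },
  });
}

pipeline.push(
  { $sort: sortField ? { [sortField]: sortDesc ? -1 : 1 } : { updatedAt: -1 } },
  {
    $facet: {
      submissions: [{ $skip: skip }, { $limit: limit }],
      totalCount: [{ $count: 'count' }],
    },
  },
);

const [result] = await this.submissionModel.aggregate(pipeline);
const total = result.totalCount[0]?.count ?? 0;
// result.submissions is the current page; total is the overall match count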
Thank you,