Commit 3f634551 authored by Benoit Socias's avatar Benoit Socias Committed by Romain Derie

[FIX] website: do not suggest generic pages for existing specific ones


Before this commit, when obtaining link URL suggestions, both the
specific page and its matching generic page were suggested.

After this commit, only the most specific pages are kept in the
suggestion list.
This commit also adapts the sitemap in the same way.
In stable, the filtering is gated behind a dedicated context key
(_filter_duplicate_pages), in case those methods are called with the
goal of obtaining both generic and specific pages.
In 16.0, those methods will always filter duplicate pages, as
originally intended.

Steps to reproduce:
- Edit the Contact Us page (to create a specific view)
- Edit the Contact Us menu
- Type "/" in the URL field
=> "/contactus" appeared twice.
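The deduplication rule introduced here can be sketched outside of Odoo as a
small standalone function. This is a minimal sketch, not Odoo's actual
implementation: the pages are plain dicts, and a page is considered
"specific" when its website_id is set, "generic" when it is None. A specific
page shadows the generic page sharing the same URL.

```python
def filter_duplicate_pages(pages):
    """Keep only the most specific page for each URL.

    A page bound to a website (website_id set) shadows the generic
    page (website_id is None) that has the same URL.
    """
    # URLs for which a website-specific page exists.
    specific_urls = {p["url"] for p in pages if p["website_id"] is not None}
    return [
        p for p in pages
        # Keep specific pages, and generic pages with no specific twin.
        if p["website_id"] is not None or p["url"] not in specific_urls
    ]

pages = [
    {"url": "/contactus", "website_id": None},  # generic page
    {"url": "/contactus", "website_id": 1},     # specific copy created by editing
    {"url": "/aboutus", "website_id": None},    # generic only, no duplicate
]

# Only the specific /contactus and the generic /aboutus survive.
print(filter_duplicate_pages(pages))
```

With this rule applied, "/contactus" is suggested only once: the specific
copy wins, and generic pages without a specific twin are left untouched.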

task-2968292

closes odoo/odoo#111603

Signed-off-by: Romain Derie (rde) <rde@odoo.com>
parent 1a24477f
@@ -201,7 +201,7 @@ class Website(Home):
         sitemaps.unlink()
         pages = 0
-        locs = request.website.with_user(request.website.user_id)._enumerate_pages()
+        locs = request.website.with_context(_filter_duplicate_pages=True).with_user(request.website.user_id)._enumerate_pages()
         while True:
             values = {
                 'locs': islice(locs, 0, LOC_PER_SITEMAP),
@@ -264,7 +264,7 @@ class Website(Home):
         current_website = request.website
         matching_pages = []
-        for page in current_website.search_pages(needle, limit=int(limit)):
+        for page in current_website.with_context(_filter_duplicate_pages=True).search_pages(needle, limit=int(limit)):
             matching_pages.append({
                 'value': page['loc'],
                 'label': 'name' in page and '%s (%s)' % (page['loc'], page['name']) or page['loc'],
@@ -272,7 +272,7 @@ class Website(Home):
         matching_urls = set(map(lambda match: match['value'], matching_pages))
         matching_last_modified = []
-        last_modified_pages = current_website._get_website_pages(order='write_date desc', limit=5)
+        last_modified_pages = current_website.with_context(_filter_duplicate_pages=True)._get_website_pages(order='write_date desc', limit=5)
         for url, name in last_modified_pages.mapped(lambda p: (p.url, p.name)):
             if needle.lower() in name.lower() or needle.lower() in url.lower() and url not in matching_urls:
                 matching_last_modified.append({
@@ -892,6 +892,9 @@ class Website(models.Model):
         domain = []
         domain += self.get_current_website().website_domain()
         pages = self.env['website.page'].sudo().search(domain, order=order, limit=limit)
+        # TODO In 16.0 remove condition on _filter_duplicate_pages.
+        if self.env.context.get('_filter_duplicate_pages'):
+            pages = pages.filtered(pages._is_most_specific_page)
         return pages

     def search_pages(self, needle=None, limit=None):