Add unique constraints safely in SQLAlchemy
When adding a unique constraint to a large table, create the unique index concurrently first and then attach the constraint to that index. Adding the constraint directly builds its index while holding an exclusive lock on the table, blocking reads and writes for the duration of the migration.
Bad:
```python
def upgrade():
    # Directly creating a unique constraint blocks reads and writes
    # while its index is built
    op.create_unique_constraint('users_email_unique', 'users', ['email'])
```
Good:
```python
# Migration 1: Create the unique index concurrently
def upgrade():
    # Create the unique index concurrently (non-blocking)
    op.create_index(
        'users_email_unique_idx',
        'users',
        ['email'],
        unique=True,
        postgresql_concurrently=True,
    )
```
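Note that PostgreSQL cannot run `CREATE INDEX CONCURRENTLY` inside a transaction block, while Alembic runs each migration inside a transaction by default. A minimal sketch of one way to handle this, assuming Alembic 1.1+ where `MigrationContext.autocommit_block()` is available:
```python
# Migration 1 (variant): build the concurrent index outside the migration
# transaction, since CREATE INDEX CONCURRENTLY cannot run inside one
from alembic import op


def upgrade():
    # autocommit_block() commits the in-progress transaction and emits the
    # enclosed statements in autocommit mode
    with op.get_context().autocommit_block():
        op.create_index(
            'users_email_unique_idx',
            'users',
            ['email'],
            unique=True,
            postgresql_concurrently=True,
        )
```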
```python
# Migration 2: Add the constraint using the existing index
def upgrade():
    # Attach the unique constraint to the index created in Migration 1
    op.create_unique_constraint(
        'users_email_unique',
        'users',
        ['email'],
        postgresql_using_index='users_email_unique_idx',
    )
```
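If duplicate values already exist, the concurrent index build fails and leaves an invalid index behind that must be dropped before retrying, so it can be worth checking for duplicates up front. A hypothetical pre-flight guard using the table and column from the example above (the query and error handling are illustrative, not part of the rule):
```python
import sqlalchemy as sa
from alembic import op


def upgrade():
    # Hypothetical guard: abort early if duplicates would make the unique
    # index build fail and leave an INVALID index behind
    conn = op.get_bind()
    duplicates = conn.execute(
        sa.text(
            "SELECT email, COUNT(*) AS n FROM users "
            "GROUP BY email HAVING COUNT(*) > 1 LIMIT 5"
        )
    ).fetchall()
    if duplicates:
        raise RuntimeError(f"Resolve duplicate emails before migrating: {duplicates}")
```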