Only concurrent indexes in SQLAlchemy
When creating or dropping indexes in PostgreSQL using SQLAlchemy (Alembic) migrations, always use the `postgresql_concurrently=True` option within an autocommit block. A plain `CREATE INDEX` locks the table against writes for the duration of the build, and a plain `DROP INDEX` blocks reads and writes; the `CONCURRENTLY` variants avoid this. Because PostgreSQL refuses to run `CREATE INDEX CONCURRENTLY` or `DROP INDEX CONCURRENTLY` inside a transaction block, and Alembic normally runs each migration inside a transaction, the operation must be wrapped in `op.get_context().autocommit_block()`.
For `upgrade()`:
Bad:
```python
def upgrade():
    op.create_index('idx_users_email', 'users', ['email'])
```
Good:
```python
def upgrade():
    with op.get_context().autocommit_block():
        op.create_index('idx_users_email', 'users', ['email'], postgresql_concurrently=True)
```
For `downgrade()`:
Bad:
```python
def downgrade():
    op.drop_index('idx_users_email', 'users')
```
Good:
```python
def downgrade():
    with op.get_context().autocommit_block():
        op.drop_index('idx_users_email', 'users', postgresql_concurrently=True)
```
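For reference, a complete migration file that applies this rule might look like the sketch below. It is a minimal illustration, not a drop-in file: the revision identifiers, index name, table, and column are placeholders for your own schema.
```python
"""Add an index on users.email without blocking writes.

Minimal illustrative migration; revision identifiers are placeholders.
"""
from alembic import op

# Revision identifiers used by Alembic (placeholders).
revision = "a1b2c3d4e5f6"
down_revision = "9f8e7d6c5b4a"
branch_labels = None
depends_on = None


def upgrade():
    # CREATE INDEX CONCURRENTLY cannot run inside a transaction block,
    # so step out of Alembic's migration transaction first.
    with op.get_context().autocommit_block():
        op.create_index(
            "idx_users_email",
            "users",
            ["email"],
            postgresql_concurrently=True,
        )


def downgrade():
    # DROP INDEX CONCURRENTLY has the same restriction.
    with op.get_context().autocommit_block():
        op.drop_index(
            "idx_users_email",
            table_name="users",
            postgresql_concurrently=True,
        )
```
Since `autocommit_block()` commits any transaction already in progress before switching to autocommit, it is generally cleanest to keep the concurrent index operation as the only step in its migration.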
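If the same migration also has to run against a database other than PostgreSQL (for example a SQLite database used in tests), one possible variation, shown here as a sketch rather than as part of the rule, is to guard the concurrent path by dialect and fall back to a plain index:
```python
from alembic import op


def upgrade():
    # Only PostgreSQL understands CONCURRENTLY; other dialects get a
    # plain (blocking) index creation instead.
    if op.get_bind().dialect.name == "postgresql":
        with op.get_context().autocommit_block():
            op.create_index(
                "idx_users_email",
                "users",
                ["email"],
                postgresql_concurrently=True,
            )
    else:
        op.create_index("idx_users_email", "users", ["email"])
```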